From: Gordon Wang <gwang@gopivotal.com>
Date: Mon, 14 Apr 2014 15:48:26 +0800
To: user@hadoop.apache.org
Subject: Re: hadoop inconsistent behaviour

You can find the reduce container from the ResourceManager's web UI.
BTW: from the log above, you can also check whether the ApplicationMaster crashed.
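For example, something like the following should work (a rough sketch: it assumes YARN log aggregation is enabled and the job has finished, it reuses the application and attempt ids from your failed run below, and the output file name is only an example):

    # Pull the aggregated logs of every container of the failed application,
    # including the ApplicationMaster container (usually container *_000001).
    yarn logs -applicationId application_1397454060494_0003 > app_0003_logs.txt

    # Find the output of the reduce attempt that ultimately failed the job.
    grep -n "attempt_1397454060494_0003_r_000003" app_0003_logs.txt

If log aggregation is not enabled, the container logs stay on each NodeManager under whatever yarn.nodemanager.log-dirs points to (typically something like $HADOOP_HOME/logs/userlogs). There you can grep the container directories for the same attempt id to see which container ran the failing reducer:

    grep -rl "attempt_1397454060494_0003_r_000003" \
        $HADOOP_HOME/logs/userlogs/application_1397454060494_0003/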
On Mon, Apr 14, 2014 at 3:12 PM, Rahul Singh wrote:
> How do I identify a reduce container? There are multiple container dirs
> in my application id folder in userlogs.
>
> On Mon, Apr 14, 2014 at 12:29 PM, Gordon Wang wrote:
>> Hi Rahul,
>>
>> What is the log of the reduce container? Please paste the log and we can
>> see the reason.
>>
>> On Mon, Apr 14, 2014 at 2:38 PM, Rahul Singh wrote:
>>> Hi,
>>> I am running a job (the wordcount example) on a 3-node cluster (1 master
>>> and 2 slaves). Sometimes the job passes and sometimes it fails (the
>>> reduce step fails; the input data is only a few KB).
>>> I am not able to nail down the reason for this inconsistency.
>>>
>>> failed log:
>>>
>>> 14/04/14 11:57:24 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
>>> 14/04/14 11:57:25 INFO client.RMProxy: Connecting to ResourceManager at /20.0.1.206:8032
>>> 14/04/14 11:57:26 INFO input.FileInputFormat: Total input paths to process : 1
>>> 14/04/14 11:57:26 INFO mapreduce.JobSubmitter: number of splits:1
>>> 14/04/14 11:57:26 INFO mapreduce.JobSubmitter: Submitting tokens for job: job_1397454060494_0003
>>> 14/04/14 11:57:26 INFO impl.YarnClientImpl: Submitted application application_1397454060494_0003
>>> 14/04/14 11:57:26 INFO mapreduce.Job: The url to track the job: http://20.0.1.206:8088/proxy/application_1397454060494_0003/
>>> 14/04/14 11:57:26 INFO mapreduce.Job: Running job: job_1397454060494_0003
>>> 14/04/14 11:57:34 INFO mapreduce.Job: Job job_1397454060494_0003 running in uber mode : false
>>> 14/04/14 11:57:34 INFO mapreduce.Job:  map 0% reduce 0%
>>> 14/04/14 11:57:40 INFO mapreduce.Job:  map 100% reduce 0%
>>> 14/04/14 11:57:46 INFO mapreduce.Job:  map 100% reduce 13%
>>> 14/04/14 11:57:48 INFO mapreduce.Job:  map 100% reduce 25%
>>> 14/04/14 11:57:49 INFO mapreduce.Job:  map 100% reduce 38%
>>> 14/04/14 11:57:50 INFO mapreduce.Job:  map 100% reduce 50%
>>> 14/04/14 11:57:54 INFO mapreduce.Job: Task Id : attempt_1397454060494_0003_r_000003_0, Status : FAILED
>>> 14/04/14 11:57:54 INFO mapreduce.Job: Task Id : attempt_1397454060494_0003_r_000001_0, Status : FAILED
>>> 14/04/14 11:57:56 INFO mapreduce.Job: Task Id : attempt_1397454060494_0003_r_000005_0, Status : FAILED
>>> 14/04/14 11:57:56 INFO mapreduce.Job: Task Id : attempt_1397454060494_0003_r_000007_0, Status : FAILED
>>> 14/04/14 11:58:02 INFO mapreduce.Job:  map 100% reduce 63%
>>> 14/04/14 11:58:04 INFO mapreduce.Job:  map 100% reduce 75%
>>> 14/04/14 11:58:09 INFO mapreduce.Job: Task Id : attempt_1397454060494_0003_r_000003_1, Status : FAILED
>>> 14/04/14 11:58:11 INFO mapreduce.Job: Task Id : attempt_1397454060494_0003_r_000005_1, Status : FAILED
>>> 14/04/14 11:58:24 INFO mapreduce.Job: Task Id : attempt_1397454060494_0003_r_000003_2, Status : FAILED
>>> 14/04/14 11:58:26 INFO mapreduce.Job: Task Id : attempt_1397454060494_0003_r_000005_2, Status : FAILED
>>> 14/04/14 11:58:40 INFO mapreduce.Job:  map 100% reduce 100%
>>> 14/04/14 11:58:40 INFO mapreduce.Job: Job job_1397454060494_0003 failed with state FAILED due to: Task failed task_1397454060494_0003_r_000003
>>> Job failed as tasks failed. failedMaps:0 failedReduces:1
>>>
>>> 14/04/14 11:58:40 INFO mapreduce.Job: Counters: 51
>>>     File System Counters
>>>         FILE: Number of bytes read=80
>>>         FILE: Number of bytes written=596766
>>>         FILE: Number of read operations=0
>>>         FILE: Number of large read operations=0
>>>         FILE: Number of write operations=0
>>>         HDFS: Number of bytes read=175
>>>         HDFS: Number of bytes written=28
>>>         HDFS: Number of read operations=21
>>>         HDFS: Number of large read operations=0
>>>         HDFS: Number of write operations=12
>>>     Job Counters
>>>         Failed reduce tasks=9
>>>         Killed reduce tasks=1
>>>         Launched map tasks=1
>>>         Launched reduce tasks=16
>>>         Data-local map tasks=1
>>>         Total time spent by all maps in occupied slots (ms)=3477
>>>         Total time spent by all reduces in occupied slots (ms)=148867
>>>         Total time spent by all map tasks (ms)=3477
>>>         Total time spent by all reduce tasks (ms)=148867
>>>         Total vcore-seconds taken by all map tasks=3477
>>>         Total vcore-seconds taken by all reduce tasks=148867
>>>         Total megabyte-seconds taken by all map tasks=3560448
>>>         Total megabyte-seconds taken by all reduce tasks=152439808
>>>     Map-Reduce Framework
>>>         Map input records=3
>>>         Map output records=13
>>>         Map output bytes=110
>>>         Map output materialized bytes=112
>>>         Input split bytes=117
>>>         Combine input records=13
>>>         Combine output records=6
>>>         Reduce input groups=4
>>>         Reduce shuffle bytes=80
>>>         Reduce input records=4
>>>         Reduce output records=4
>>>         Spilled Records=10
>>>         Shuffled Maps =6
>>>         Failed Shuffles=0
>>>         Merged Map outputs=6
>>>         GC time elapsed (ms)=142
>>>         CPU time spent (ms)=6420
>>>         Physical memory (bytes) snapshot=1100853248
>>>         Virtual memory (bytes) snapshot=4468314112
>>>         Total committed heap usage (bytes)=1406992384
>>>     Shuffle Errors
>>>         BAD_ID=0
>>>         CONNECTION=0
>>>         IO_ERROR=0
>>>         WRONG_LENGTH=0
>>>         WRONG_MAP=0
>>>         WRONG_REDUCE=0
>>>     File Input Format Counters
>>>         Bytes Read=58
>>>     File Output Format Counters
>>>         Bytes Written=28
>>>
>>> Job Passing Logs:
>>> hadoop jar ../share/hadoop/mapreduce/hadoop-mapreduce-examples-2.3.0.jar wordcount /user/hduser/input /user/hduser/output_wordcount9
>>> 14/04/14 11:47:27 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
>>> 14/04/14 11:47:28 INFO client.RMProxy: Connecting to ResourceManager at /20.0.1.206:8032
>>> 14/04/14 11:47:28 INFO input.FileInputFormat: Total input paths to process : 1
>>> 14/04/14 11:47:29 INFO mapreduce.JobSubmitter: number of splits:1
>>> 14/04/14 11:47:29 INFO mapreduce.JobSubmitter: Submitting tokens for job: job_1397454060494_0002
>>> 14/04/14 11:47:29 INFO impl.YarnClientImpl: Submitted application application_1397454060494_0002
>>> 14/04/14 11:47:29 INFO mapreduce.Job: The url to track the job: http://20.0.1.206:8088/proxy/application_1397454060494_0002/
>>> 14/04/14 11:47:29 INFO mapreduce.Job: Running job: job_1397454060494_0002
>>> 14/04/14 11:47:36 INFO mapreduce.Job: Job job_1397454060494_0002 running in uber mode : false
>>> 14/04/14 11:47:36 INFO mapreduce.Job:  map 0% reduce 0%
>>> 14/04/14 11:47:50 INFO mapreduce.Job: Task Id : attempt_1397454060494_0002_m_000000_0, Status : FAILED
>>> 14/04/14 11:48:05 INFO mapreduce.Job: Task Id : attempt_1397454060494_0002_m_000000_1, Status : FAILED
>>> 14/04/14 11:48:20 INFO mapreduce.Job: Task Id : attempt_1397454060494_0002_m_000000_2, Status : FAILED
>>> 14/04/14 11:48:26 INFO mapreduce.Job:  map 100% reduce 0%
>>> 14/04/14 11:48:34 INFO mapreduce.Job:  map 100% reduce 13%
>>> 14/04/14 11:48:35 INFO mapreduce.Job:  map 100% reduce 25%
>>> 14/04/14 11:48:37 INFO mapreduce.Job:  map 100% reduce 50%
>>> 14/04/14 11:48:41 INFO mapreduce.Job: Task Id : attempt_1397454060494_0002_r_000001_0, Status : FAILED
>>> 14/04/14 11:48:42 INFO mapreduce.Job: Task Id : attempt_1397454060494_0002_r_000003_0, Status : FAILED
>>> 14/04/14 11:48:43 INFO mapreduce.Job: Task Id : attempt_1397454060494_0002_r_000005_0, Status : FAILED
>>> 14/04/14 11:48:44 INFO mapreduce.Job: Task Id : attempt_1397454060494_0002_r_000007_0, Status : FAILED
>>> 14/04/14 11:48:50 INFO mapreduce.Job:  map 100% reduce 63%
>>> 14/04/14 11:48:51 INFO mapreduce.Job:  map 100% reduce 75%
>>> 14/04/14 11:48:52 INFO mapreduce.Job:  map 100% reduce 88%
>>> 14/04/14 11:48:58 INFO mapreduce.Job: Task Id : attempt_1397454060494_0002_r_000005_1, Status : FAILED
>>> 14/04/14 11:49:05 INFO mapreduce.Job:  map 100% reduce 100%
>>> 14/04/14 11:49:06 INFO mapreduce.Job: Job job_1397454060494_0002 completed successfully
>>> 14/04/14 11:49:06 INFO mapreduce.Job: Counters: 52
>>>     File System Counters
>>>         FILE: Number of bytes read=112
>>>         FILE: Number of bytes written=767175
>>>         FILE: Number of read operations=0
>>>         FILE: Number of large read operations=0
>>>         FILE: Number of write operations=0
>>>         HDFS: Number of bytes read=175
>>>         HDFS: Number of bytes written=40
>>>         HDFS: Number of read operations=27
>>>         HDFS: Number of large read operations=0
>>>         HDFS: Number of write operations=16
>>>     Job Counters
>>>         Failed map tasks=3
>>>         Failed reduce tasks=5
>>>         Launched map tasks=4
>>>         Launched reduce tasks=13
>>>         Other local map tasks=3
>>>         Data-local map tasks=1
>>>         Total time spent by all maps in occupied slots (ms)=41629
>>>         Total time spent by all reduces in occupied slots (ms)=104530
>>>         Total time spent by all map tasks (ms)=41629
>>>         Total time spent by all reduce tasks (ms)=104530
>>>         Total vcore-seconds taken by all map tasks=41629
>>>         Total vcore-seconds taken by all reduce tasks=104530
>>>         Total megabyte-seconds taken by all map tasks=42628096
>>>         Total megabyte-seconds taken by all reduce tasks=107038720
>>>     Map-Reduce Framework
>>>         Map input records=3
>>>         Map output records=13
>>>         Map output bytes=110
>>>         Map output materialized bytes=112
>>>         Input split bytes=117
>>>         Combine input records=13
>>>         Combine output records=6
>>>         Reduce input groups=6
>>>         Reduce shuffle bytes=112
>>>         Reduce input records=6
>>>         Reduce output records=6
>>>         Spilled Records=12
>>>         Shuffled Maps =8
>>>         Failed Shuffles=0
>>>         Merged Map outputs=8
>>>         GC time elapsed (ms)=186
>>>         CPU time spent (ms)=8890
>>>         Physical memory (bytes) snapshot=1408913408
>>>         Virtual memory (bytes) snapshot=5727019008
>>>         Total committed heap usage (bytes)=1808990208
>>>     Shuffle Errors
>>>         BAD_ID=0
>>>         CONNECTION=0
>>>         IO_ERROR=0
>>>         WRONG_LENGTH=0
>>>         WRONG_MAP=0
>>>         WRONG_REDUCE=0
>>>     File Input Format Counters
>>>         Bytes Read=58
>>>     File Output Format Counters
>>>         Bytes Written=40
>>>
>>> Thanks and Regards,
>>> -Rahul Singh
>>
>> --
>> Regards
>> Gordon Wang

--
Regards
Gordon Wang