From: Dieter De Witte
Date: Fri, 26 Jun 2015 18:42:51 +0200
Subject: Re: Invalid key type in the map task
To: user@hadoop.apache.org
In-Reply-To: <558D7A3A.9020100@gmail.com>

You need to show the driver class as well. Are you using TextInputFormat? Be aware that this standard input format takes the complete line (up to the newline separator) as the value; the key in that case is the byte offset of the line within the file, and definitely not the number you assume it will be.

2015-06-26 18:13 GMT+02:00 xeonmailinglist-gmail <xeonmailinglist@gmail.com>:

> Hi,
>
> I have this map class that accepts input files with a key of LongWritable
> and a value of Text.
>
> The input file is in [1]. Here we can see that it contains a key as a Long
> (I think) and bytes as the value.
> In [2] is my map class. The goal of the map class is to read the input
> data and send it out as it is.
> I was expecting that the key would be the index number and the value would
> be the gibberish. But I am getting a strange error [3], as the key is
> arriving there as Text, and not the index.
>
> Can anyone tell me why I get this error, and how I can solve it?
>
> [1] Input file
>
> xubuntu@hadoop-coc-1:~/Programs/hadoop$ hdfs dfs -cat /output1-1434970707/part-m-00000
> 15/06/26 12:01:35 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
> 0    SEQ "org.apache.hadoop.io.BytesWritable"org.apache.hadoop.io.BytesWritable [binary data]
> 187  [binary data]
> 404  [binary data]
>
> [2] My map class
>
>     /** Identity mapper set by the user. */
>     public static class MyFullyIndentityMapperWebdataScan
>             extends Mapper<LongWritable, Text, LongWritable, Text> {
>
>         private LongWritable word = new LongWritable();
>         private Text rvalue = new Text();
>
>         public void map(LongWritable key, Text value, Context context)
>                 throws IOException, InterruptedException {
>
>             System.out.println("1: " + key.getClass().getName() + " " + value.getClass().getName());
>             System.out.println("2: " + context.getCurrentKey().getClass() + " " + context.getCurrentValue().getClass());
>             context.write(key, value);
>         }
>     }
>
> [3] Output of execution
>
> Log Type: stdout
> Log Upload Time: 26-Jun-2015 11:58:53
> Log Length: 138
>
> 1: org.apache.hadoop.io.LongWritable org.apache.hadoop.io.Text
> 2: class org.apache.hadoop.io.LongWritable class org.apache.hadoop.io.Text
>
> Log Type: syslog
> Log Upload Time: 26-Jun-2015 11:58:53
> Log Length: 4367
> Showing 4096 bytes of 4367 total.
>
> operties from hadoop-metrics2.properties
> 2015-06-26 11:58:32,118 INFO [main] org.apache.hadoop.metrics2.impl.MetricsSystemImpl: Scheduled snapshot period at 10 second(s).
> 2015-06-26 11:58:32,118 INFO [main] org.apache.hadoop.metrics2.impl.MetricsSystemImpl: MapTask metrics system started
> 2015-06-26 11:58:32,128 INFO [main] org.apache.hadoop.mapred.YarnChild: Executing with tokens:
> 2015-06-26 11:58:32,128 INFO [main] org.apache.hadoop.mapred.YarnChild: Kind: mapreduce.job, Service: job_1435332416394_0009, Ident: (org.apache.hadoop.mapreduce.security.token.JobTokenIdentifier@5d7657)
> 2015-06-26 11:58:32,208 INFO [main] org.apache.hadoop.mapred.YarnChild: Sleeping for 0ms before retrying again. Got null now.
> 2015-06-26 11:58:32,439 INFO [main] org.apache.hadoop.mapred.YarnChild: mapreduce.cluster.local.dir for child: /tmp/hadoop-temp/nm-local-dir/usercache/xubuntu/appcache/application_1435332416394_0009
> 2015-06-26 11:58:32,611 INFO [main] org.apache.hadoop.conf.Configuration.deprecation: session.id is deprecated. Instead, use dfs.metrics.session-id
> 2015-06-26 11:58:33,038 INFO [main] org.apache.hadoop.mapred.Task: Using ResourceCalculatorProcessTree : [ ]
> 2015-06-26 11:58:33,180 INFO [main] org.apache.hadoop.mapred.MapTask: Processing split: hdfs://hadoop-coc-1:9000/output1-1434970707/part-m-00000:0+17853
> 2015-06-26 11:58:33,252 INFO [main] org.apache.hadoop.mapred.MapTask: (EQUATOR) 0 kvi 26214396(104857584)
> 2015-06-26 11:58:33,252 INFO [main] org.apache.hadoop.mapred.MapTask: mapreduce.task.io.sort.mb: 100
> 2015-06-26 11:58:33,252 INFO [main] org.apache.hadoop.mapred.MapTask: soft limit at 83886080
> 2015-06-26 11:58:33,252 INFO [main] org.apache.hadoop.mapred.MapTask: bufstart = 0; bufvoid = 104857600
> 2015-06-26 11:58:33,252 INFO [main] org.apache.hadoop.mapred.MapTask: kvstart = 26214396; length = 6553600
> 2015-06-26 11:58:33,255 INFO [main] org.apache.hadoop.mapred.MapTask: Map output collector class = org.apache.hadoop.mapred.MapTask$MapOutputBuffer
> 2015-06-26 11:58:33,269 INFO [main] org.apache.hadoop.mapred.MapTask: Starting flush of map output
> 2015-06-26 11:58:33,276 WARN [main] org.apache.hadoop.mapred.YarnChild: Exception running child : java.io.IOException: Type mismatch in key from map: expected org.apache.hadoop.io.Text, received org.apache.hadoop.io.LongWritable
>     at org.apache.hadoop.mapred.MapTask$MapOutputBuffer.collect(MapTask.java:1069)
>     at org.apache.hadoop.mapred.MapTask$NewOutputCollector.write(MapTask.java:712)
>     at org.apache.hadoop.mapreduce.task.TaskInputOutputContextImpl.write(TaskInputOutputContextImpl.java:89)
>     at org.apache.hadoop.mapreduce.lib.map.WrappedMapper$Context.write(WrappedMapper.java:112)
>     at org.apache.hadoop.mapred.examples.MyWebDataScan$MyFullyIndentityMapperWebdataScan.map(MyWebDataScan.java:144)
>     at org.apache.hadoop.mapred.examples.MyWebDataScan$MyFullyIndentityMapperWebdataScan.map(MyWebDataScan.java:131)
>     at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:145)
>     at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:784)
>     at org.apache.hadoop.mapred.MapTask.run(MapTask.java:341)
>     at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:163)
>     at java.security.AccessController.doPrivileged(Native Method)
>     at javax.security.auth.Subject.doAs(Subject.java:422)
>     at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1628)
>     at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:158)
>
> --
> Thanks,
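The exception quoted in [3] is raised where MapOutputBuffer compares the class of each key the mapper emits against the job's configured map output key class; when the driver configures Text but the mapper writes the LongWritable it received, the check fails. The usual fix is setting job.setMapOutputKeyClass(LongWritable.class) (and the matching value class) in the driver. A minimal, Hadoop-free sketch of that check — the nested classes below are stand-ins for the real Hadoop types, not the actual implementation:

```java
public class TypeMismatchDemo {
    // Stand-ins for org.apache.hadoop.io.Text and org.apache.hadoop.io.LongWritable.
    static class Text {}
    static class LongWritable {}

    // Simplified version of the key check MapOutputBuffer performs on collect():
    // the emitted key's runtime class must equal the configured map output key class.
    static String collect(Class<?> configuredKeyClass, Object key) {
        if (key.getClass() != configuredKeyClass) {
            return "Type mismatch in key from map: expected "
                    + configuredKeyClass.getName()
                    + ", received " + key.getClass().getName();
        }
        return "ok";
    }

    public static void main(String[] args) {
        // Driver configured Text, but the mapper emits its LongWritable input key.
        System.out.println(collect(Text.class, new LongWritable()));
        // With the configured key class matching the mapper's output, the check passes.
        System.out.println(collect(LongWritable.class, new LongWritable()));
    }
}
```

The same reference-equality idea explains why the mapper's own println in [2] still shows LongWritable/Text: the mismatch is not inside the mapper, but between the mapper's output and what the (unshown) driver declared.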