Date: Wed, 28 Oct 2015 20:33:07 +0000 (UTC)
From: Kiru Pakkirisamy <kirupakkirisamy@yahoo.com>
Reply-To: Kiru Pakkirisamy <kirupakkirisamy@yahoo.com>
To: "user@hadoop.apache.org" <user@hadoop.apache.org>
Subject: Re: lzo error while running mr job

Harsh,

Thank you very much for your valuable/assertive suggestion :-) I was able to identify the problem and fix it. Elsewhere in the code, we were setting a different mapred-site.xml in the configuration. I still do not know why it is using the DefaultCodec for compression (instead of the one I set, SnappyCodec), but I am hopeful I will get there. Thanks again.

Regards,
- kiru


From: Harsh J <harsh@cloudera.com>
To: "user@hadoop.apache.org" <user@hadoop.apache.org>
Sent: Tuesday, October 27, 2015 8:34 AM
Subject: Re: lzo error while running mr job

The stack trace is pretty certain you do, as it clearly tries to load a class that does not belong within Apache Hadoop. Try looking at the XML files the application uses? Perhaps you've missed one of the spots.

If I had to guess, given the JobSubmitter entry in the trace, it'd be in the submitting host's /etc/hadoop/conf/* files, or in the dir pointed to by $HADOOP_CONF_DIR (if that's specifically set). Alternatively, it'd be in the code.

If you have control over the code, you can also make it dump the XML before submit via job.getConfiguration().writeXml(System.out);. The XML dump will carry the source of each property along with its value.
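A minimal, self-contained sketch of that driver-side dump (the job name below is a placeholder, and the per-property source check via Configuration.getPropertySources is an extra illustration, not something from this thread):

    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.mapreduce.Job;

    public class DumpJobConf {
        public static void main(String[] args) throws Exception {
            // Placeholder job; in a real driver this would be the Job
            // you are about to submit.
            Job job = Job.getInstance(new Configuration(), "conf-dump-example");
            Configuration conf = job.getConfiguration();

            // Dump the fully-resolved configuration, as suggested above.
            conf.writeXml(System.out);

            // Narrower check: print the effective codec list and the
            // resource(s) (core-site.xml, mapred-site.xml, program code,
            // ...) that set it.
            System.out.println();
            System.out.println("io.compression.codecs = "
                    + conf.get("io.compression.codecs"));
            String[] sources = conf.getPropertySources("io.compression.codecs");
            if (sources != null) {
                for (String source : sources) {
                    System.out.println("  set by: " + source);
                }
            }
        }
    }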
On Tue, Oct 27, 2015 at 8:52 PM Kiru Pakkirisamy <kirupakkirisamy@yahoo.com> wrote:

| Harsh,
| We don't have lzo in the io.compression.codecs list. That is what is
| puzzling me.
| Regards,
| Kiru
|
| From: "Harsh J" <harsh@cloudera.com>
| Date: Mon, Oct 26, 2015 at 11:39 PM
| Subject: Re: lzo error while running mr job
|
| Every codec in the io.compression.codecs list of classes will be
| initialised, regardless of actual further use. Since the Lzo*Codec
| classes require the native library to initialise, the failure is
| therefore expected.
|
| On Tue, Oct 27, 2015 at 11:42 AM Kiru Pakkirisamy <kirupakkirisamy@yahoo.com> wrote:
|
| I am seeing a weird error after we moved to the new hadoop mapreduce
| java packages in 2.4. We are not using lzo (as in io.compression.codecs),
| but we still get this error. Does it mean we have to have lzo installed
| even though we are not using it? Thanks.
|
| Regards,
| - kiru
|
| 2015-10-27 00:18:57,994 ERROR com.hadoop.compression.lzo.GPLNativeCodeLoader | Could not load native gpl library
| java.lang.UnsatisfiedLinkError: no gplcompression in java.library.path
|     at java.lang.ClassLoader.loadLibrary(ClassLoader.java:1886) ~[?:1.7.0_85]
|     at java.lang.Runtime.loadLibrary0(Runtime.java:849) ~[?:1.7.0_85]
|     at java.lang.System.loadLibrary(System.java:1088) ~[?:1.7.0_85]
|     at com.hadoop.compression.lzo.GPLNativeCodeLoader.<clinit>(GPLNativeCodeLoader.java:31) [flow-trunk.242-470787.jar:?]
|     at com.hadoop.compression.lzo.LzoCodec.<clinit>(LzoCodec.java:60) [flow-trunk.242-470787.jar:?]
|     at java.lang.Class.forName0(Native Method) [?:1.7.0_85]
|     at java.lang.Class.forName(Class.java:278) [?:1.7.0_85]
|     at org.apache.hadoop.conf.Configuration.getClassByNameOrNull(Configuration.java:1834) [flow-trunk.242-470787.jar:?]
|     at org.apache.hadoop.conf.Configuration.getClassByName(Configuration.java:1799) [flow-trunk.242-470787.jar:?]
|     at org.apache.hadoop.io.compress.CompressionCodecFactory.getCodecClasses(CompressionCodecFactory.java:128) [flow-trunk.242-470787.jar:?]
|     at org.apache.hadoop.io.compress.CompressionCodecFactory.<init>(CompressionCodecFactory.java:175) [flow-trunk.242-470787.jar:?]
|     at org.apache.hadoop.mapreduce.lib.input.CombineFileInputFormat.isSplitable(CombineFileInputFormat.java:159) [flow-trunk.242-470787.jar:?]
|     at org.apache.hadoop.mapreduce.lib.input.CombineFileInputFormat.getMoreSplits(CombineFileInputFormat.java:283) [flow-trunk.242-470787.jar:?]
|     at org.apache.hadoop.mapreduce.lib.input.CombineFileInputFormat.getSplits(CombineFileInputFormat.java:243) [flow-trunk.242-470787.jar:?]
|     at org.apache.hadoop.mapreduce.JobSubmitter.writeNewSplits(JobSubmitter.java:493) [flow-trunk.242-470787.jar:?]
|
| Regards,
| - kiru
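For anyone hitting the same trap: the trace above shows why the error fires even when nothing uses LZO. CombineFileInputFormat.isSplitable constructs a CompressionCodecFactory, whose constructor loads and initialises every class named in io.compression.codecs, and LzoCodec's static initialiser then demands the native libgplcompression library. A minimal sketch, with an illustrative (not recommended) codec list, of pinning the list explicitly and exercising that initialisation step up front:

    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.io.compress.CompressionCodecFactory;

    public class CodecListCheck {
        public static void main(String[] args) {
            Configuration conf = new Configuration();

            // Set the codec list explicitly so a stray core-site.xml or
            // mapred-site.xml entry cannot sneak
            // com.hadoop.compression.lzo.LzoCodec back in. (SnappyCodec
            // still needs the native hadoop library at compress time.)
            conf.set("io.compression.codecs",
                    "org.apache.hadoop.io.compress.DefaultCodec,"
                  + "org.apache.hadoop.io.compress.SnappyCodec");

            // Constructing the factory loads and initialises every listed
            // codec class -- the same step that throws UnsatisfiedLinkError
            // in the trace above when an Lzo*Codec is listed but
            // libgplcompression is not installed.
            new CompressionCodecFactory(conf);
            System.out.println("All configured codecs loaded.");
        }
    }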