From: Levy Meny
To: user@hbase.apache.org
CC: Yaniv Ofer
Date: Sun, 9 Jun 2013 15:35:50 +0300
Subject: Compression class loading mismatch in 0.94.2

Hi,

Does anyone know why org.apache.hadoop.hbase.io.hfile.Compression was changed in 0.94.2 to load the Snappy codec class through the system class loader, instead of the context class loader it used in previous versions (e.g. 0.92.1)?
    private CompressionCodec buildCodec(Configuration conf) {
      try {
        Class externalCodec = ClassLoader.getSystemClassLoader()
            .loadClass("org.apache.hadoop.io.compress.SnappyCodec");
        return (CompressionCodec) ReflectionUtils.newInstance(externalCodec, conf);
        ...

(By the way, you will notice that the context class loader is still used for loading other codecs, e.g. Lz4Codec.)

The error I got:

2013-05-31 00:01:25,704 [ERROR] [BulkImportManager-2-thread-1] org.apache.hadoop.hbase.mapreduce.LoadIncrementalHFiles (LoadIncrementalHFiles.java:343) - Unexpected execution exception during splitting
java.util.concurrent.ExecutionException: java.lang.RuntimeException: java.lang.ClassNotFoundException: org.apache.hadoop.io.compress.SnappyCodec
        at java.util.concurrent.FutureTask$Sync.innerGet(Unknown Source)
        at java.util.concurrent.FutureTask.get(Unknown Source)
        at org.apache.hadoop.hbase.mapreduce.LoadIncrementalHFiles.groupOrSplitPhase(LoadIncrementalHFiles.java:333)
        at org.apache.hadoop.hbase.mapreduce.LoadIncrementalHFiles.doBulkLoad(LoadIncrementalHFiles.java:232)
        ...
Caused by: java.lang.RuntimeException: java.lang.ClassNotFoundException: org.apache.hadoop.io.compress.SnappyCodec
        at org.apache.hadoop.hbase.io.hfile.Compression$Algorithm$4.buildCodec(Compression.java:207)
        at org.apache.hadoop.hbase.io.hfile.Compression$Algorithm$4.getCodec(Compression.java:192)
        at org.apache.hadoop.hbase.io.hfile.Compression$Algorithm.getCompressor(Compression.java:302)
        at org.apache.hadoop.hbase.io.hfile.HFileBlock$Writer.<init>(HFileBlock.java:745)
        at org.apache.hadoop.hbase.io.hfile.HFileWriterV2.finishInit(HFileWriterV2.java:134)
        at org.apache.hadoop.hbase.io.hfile.HFileWriterV2.<init>(HFileWriterV2.java:125)
        at org.apache.hadoop.hbase.io.hfile.HFileWriterV2$WriterFactoryV2.createWriter(HFileWriterV2.java:105)
        at org.apache.hadoop.hbase.io.hfile.HFile$WriterFactory.create(HFile.java:394)
        at org.apache.hadoop.hbase.regionserver.StoreFile$Writer.<init>(StoreFile.java:1003)
        at org.apache.hadoop.hbase.regionserver.StoreFile$Writer.<init>(StoreFile.java:948)
        at org.apache.hadoop.hbase.regionserver.StoreFile$WriterBuilder.build(StoreFile.java:851)
        at org.apache.hadoop.hbase.mapreduce.LoadIncrementalHFiles.copyHFileHalf(LoadIncrementalHFiles.java:541)
        at org.apache.hadoop.hbase.mapreduce.LoadIncrementalHFiles.splitStoreFile(LoadIncrementalHFiles.java:514)
        at org.apache.hadoop.hbase.mapreduce.LoadIncrementalHFiles.splitStoreFile(LoadIncrementalHFiles.java:375)
        at org.apache.hadoop.hbase.mapreduce.LoadIncrementalHFiles.groupOrSplit(LoadIncrementalHFiles.java:439)
        at org.apache.hadoop.hbase.mapreduce.LoadIncrementalHFiles$2.call(LoadIncrementalHFiles.java:323)
        at org.apache.hadoop.hbase.mapreduce.LoadIncrementalHFiles$2.call(LoadIncrementalHFiles.java:321)
        at java.util.concurrent.FutureTask$Sync.innerRun(Unknown Source)
        at java.util.concurrent.FutureTask.run(Unknown Source)
        ... 3 more
Caused by: java.lang.ClassNotFoundException: org.apache.hadoop.io.compress.SnappyCodec
        at java.net.URLClassLoader$1.run(Unknown Source)
        at java.security.AccessController.doPrivileged(Native Method)
        at java.net.URLClassLoader.findClass(Unknown Source)
        at java.lang.ClassLoader.loadClass(Unknown Source)
        at sun.misc.Launcher$AppClassLoader.loadClass(Unknown Source)
        at java.lang.ClassLoader.loadClass(Unknown Source)
        at org.apache.hadoop.hbase.io.hfile.Compression$Algorithm$4.buildCodec(Compression.java:201)
        ... 21 more
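For comparison, here is a minimal sketch of the lookup I would have expected, based on my understanding of the pre-0.94.2 behaviour (the class and method names below are my own, not HBase's): resolve the codec through the thread context class loader, which inside Tomcat is the webapp class loader and can therefore see WEB-INF/lib.

    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.io.compress.CompressionCodec;
    import org.apache.hadoop.util.ReflectionUtils;

    // Hypothetical helper, not HBase code: loads SnappyCodec through the
    // thread context class loader (falling back to this class's own loader),
    // which is what I believe 0.92.x effectively did.
    public class ContextCodecLoader {
        public static CompressionCodec buildSnappyCodec(Configuration conf)
                throws ClassNotFoundException {
            ClassLoader cl = Thread.currentThread().getContextClassLoader();
            if (cl == null) {
                cl = ContextCodecLoader.class.getClassLoader();
            }
            Class<?> codecClass =
                cl.loadClass("org.apache.hadoop.io.compress.SnappyCodec");
            return (CompressionCodec) ReflectionUtils.newInstance(codecClass, conf);
        }
    }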
This change makes it impossible for me to bulk-load files into HBase from my application when it runs as a Tomcat web application: Tomcat does not allow setting the system class loader, and all the HBase and Hadoop jars are in my WEB-INF/lib, so they are not visible to the system class loader.

Thanks,

Meny Levy
R&D
T +972-3-7663350
M +972-52-8543350
Meny.Levy@comverse.com
www.comverse.com
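P.S. A quick way to see the visibility gap from inside the webapp (hypothetical snippet of mine, invoked e.g. from a servlet):

    // Probes both class loaders for SnappyCodec. In my Tomcat setup the
    // context (webapp) loader finds it and the system loader does not.
    public class LoaderProbe {
        public static void probe(String label, ClassLoader cl) {
            try {
                cl.loadClass("org.apache.hadoop.io.compress.SnappyCodec");
                System.out.println(label + ": found");
            } catch (ClassNotFoundException e) {
                System.out.println(label + ": not found");
            }
        }

        public static void run() {
            probe("context class loader",
                Thread.currentThread().getContextClassLoader());
            probe("system class loader", ClassLoader.getSystemClassLoader());
        }
    }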