hbase-user mailing list archives

From Levy Meny <Meny.L...@comverse.com>
Subject Compression class loading mismatch in 0.94.2
Date Sun, 09 Jun 2013 12:35:50 GMT
Hi,
Does anyone know why org.apache.hadoop.hbase.io.hfile.Compression was changed in 0.94.2 to use
the SystemClassLoader to load the Snappy codec class, instead of the ContextClassLoader used in
previous versions (e.g. 0.92.1)?

      private CompressionCodec buildCodec(Configuration conf) {
        try {
          Class<?> externalCodec = ClassLoader.getSystemClassLoader().loadClass("org.apache.hadoop.io.compress.SnappyCodec");
          return (CompressionCodec) ReflectionUtils.newInstance(externalCodec, conf);
        } catch (ClassNotFoundException e) {
          throw new RuntimeException(e);  // surfaces as the RuntimeException in the trace below
        }
      }

(BTW, you will notice that the ContextClassLoader is still used to load other codecs, e.g. the Lz4Codec.)
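
To illustrate the difference, here is a small check of my own (not HBase code; the class and
method names are just for illustration) that, when called from inside the webapp, e.g. from a
servlet, with "org.apache.hadoop.io.compress.SnappyCodec", reports the class as visible to the
context class loader but not to the system class loader:

      // Illustrative helper only: compares the two lookup strategies for a class name.
      public final class CodecLoaderCheck {
        public static void report(String className) {
          // What 0.94.2 does for Snappy: the JVM-wide system class loader,
          // which in Tomcat does not see WEB-INF/lib.
          tryLoad("system", ClassLoader.getSystemClassLoader(), className);
          // What the older code relied on: the thread context class loader,
          // which inside a webapp is the webapp class loader and does see WEB-INF/lib.
          tryLoad("context", Thread.currentThread().getContextClassLoader(), className);
        }

        private static void tryLoad(String label, ClassLoader cl, String name) {
          try {
            cl.loadClass(name);
            System.out.println(label + " class loader: found " + name);
          } catch (ClassNotFoundException e) {
            System.out.println(label + " class loader: NOT found: " + name);
          }
        }
      }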

The error I got:

2013-05-31 00:01:25,704 [ERROR] [BulkImportManager-2-thread-1] org.apache.hadoop.hbase.mapreduce.LoadIncrementalHFiles
(LoadIncrementalHFiles.java:343) - Unexpected execution exception during splitting
java.util.concurrent.ExecutionException: java.lang.RuntimeException: java.lang.ClassNotFoundException:
org.apache.hadoop.io.compress.SnappyCodec
        at java.util.concurrent.FutureTask$Sync.innerGet(Unknown Source)
        at java.util.concurrent.FutureTask.get(Unknown Source)
        at org.apache.hadoop.hbase.mapreduce.LoadIncrementalHFiles.groupOrSplitPhase(LoadIncrementalHFiles.java:333)
        at org.apache.hadoop.hbase.mapreduce.LoadIncrementalHFiles.doBulkLoad(LoadIncrementalHFiles.java:232)
...
Caused by: java.lang.RuntimeException: java.lang.ClassNotFoundException: org.apache.hadoop.io.compress.SnappyCodec
        at org.apache.hadoop.hbase.io.hfile.Compression$Algorithm$4.buildCodec(Compression.java:207)
        at org.apache.hadoop.hbase.io.hfile.Compression$Algorithm$4.getCodec(Compression.java:192)
        at org.apache.hadoop.hbase.io.hfile.Compression$Algorithm.getCompressor(Compression.java:302)
        at org.apache.hadoop.hbase.io.hfile.HFileBlock$Writer.<init>(HFileBlock.java:745)
        at org.apache.hadoop.hbase.io.hfile.HFileWriterV2.finishInit(HFileWriterV2.java:134)
        at org.apache.hadoop.hbase.io.hfile.HFileWriterV2.<init>(HFileWriterV2.java:125)
        at org.apache.hadoop.hbase.io.hfile.HFileWriterV2$WriterFactoryV2.createWriter(HFileWriterV2.java:105)
        at org.apache.hadoop.hbase.io.hfile.HFile$WriterFactory.create(HFile.java:394)
        at org.apache.hadoop.hbase.regionserver.StoreFile$Writer.<init>(StoreFile.java:1003)
        at org.apache.hadoop.hbase.regionserver.StoreFile$Writer.<init>(StoreFile.java:948)
        at org.apache.hadoop.hbase.regionserver.StoreFile$WriterBuilder.build(StoreFile.java:851)
        at org.apache.hadoop.hbase.mapreduce.LoadIncrementalHFiles.copyHFileHalf(LoadIncrementalHFiles.java:541)
        at org.apache.hadoop.hbase.mapreduce.LoadIncrementalHFiles.splitStoreFile(LoadIncrementalHFiles.java:514)
        at org.apache.hadoop.hbase.mapreduce.LoadIncrementalHFiles.splitStoreFile(LoadIncrementalHFiles.java:375)
        at org.apache.hadoop.hbase.mapreduce.LoadIncrementalHFiles.groupOrSplit(LoadIncrementalHFiles.java:439)
        at org.apache.hadoop.hbase.mapreduce.LoadIncrementalHFiles$2.call(LoadIncrementalHFiles.java:323)
        at org.apache.hadoop.hbase.mapreduce.LoadIncrementalHFiles$2.call(LoadIncrementalHFiles.java:321)
        at java.util.concurrent.FutureTask$Sync.innerRun(Unknown Source)
        at java.util.concurrent.FutureTask.run(Unknown Source)
        ... 3 more
Caused by: java.lang.ClassNotFoundException: org.apache.hadoop.io.compress.SnappyCodec
        at java.net.URLClassLoader$1.run(Unknown Source)
        at java.security.AccessController.doPrivileged(Native Method)
        at java.net.URLClassLoader.findClass(Unknown Source)
        at java.lang.ClassLoader.loadClass(Unknown Source)
        at sun.misc.Launcher$AppClassLoader.loadClass(Unknown Source)
        at java.lang.ClassLoader.loadClass(Unknown Source)
        at org.apache.hadoop.hbase.io.hfile.Compression$Algorithm$4.buildCodec(Compression.java:201)
        ... 21 more

This change makes it impossible for me to bulk-load files into HBase from my application when it
runs as a Tomcat web application: Tomcat does not allow setting the system class loader, and all
the HBase and Hadoop jars sit in my WEB-INF/lib, so they are not visible to the system class
loader.

Thanks,

Meny Levy
R&D

T   +972-3-7663350
M +972-52-8543350
Meny.Levy@comverse.com
www.comverse.com


