hbase-user mailing list archives

From Ankit Jain <ankitjainc...@gmail.com>
Subject Re: Snappy compression not working with HBase 0.98.3
Date Sat, 12 Jul 2014 05:11:37 GMT
Hi,

Thanks, all, for the replies.

We have followed the steps in the HBase book, and we have also set
HBASE_LIBRARY_PATH in the hbase-env.sh configuration file, but we are still
getting the error above.
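For context, here is roughly what we have in hbase-env.sh and the checks we run afterwards. The library path below is just an example location from our environment, not something prescribed by the book; adjust it to wherever your native Hadoop and snappy libraries actually live.

```shell
# hbase-env.sh -- example setting; point this at the directory that
# actually contains libhadoop.so and libsnappy.so on your hosts
export HBASE_LIBRARY_PATH=/usr/lib/hadoop/lib/native

# Verify that Hadoop itself can see its native libraries
# (the checknative command is available from Hadoop 2.4.0 on)
hadoop checknative -a

# Then re-run the HBase smoke test for snappy
bin/hbase org.apache.hadoop.hbase.util.CompressionTest file:///tmp/test.txt snappy
```

If `checknative` reports snappy as false, the native library is not visible to the JVM and the CompressionTest will keep failing regardless of the HBase-side settings.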

Regards,
Ankit

On Fri, Jul 11, 2014 at 11:11 PM, Esteban Gutierrez <esteban@cloudera.com>
wrote:

> Hello Hanish,
>
> Since 0.95, a compression test has been added to the HBase Master, so you
> now need to make sure the native libraries are installed on the HBase
> Master(s) and not just on the Region Servers (see HBASE-6370 for details
> about this change).
>
> Regards,
> Esteban.
>
> --
> Cloudera, Inc.
>
>
>
> On Fri, Jul 11, 2014 at 7:14 AM, Ted Yu <yuzhihong@gmail.com> wrote:
>
> > Please see
> > http://hbase.apache.org/book.html#snappy.compression.installation
> >
> > Cheers
> >
> >
> > On Fri, Jul 11, 2014 at 3:37 AM, Hanish Bansal <
> > hanish.bansal.agarwal@gmail.com> wrote:
> >
> > > We are using hbase 0.98.3 with hadoop 2.4.0.
> > >
> > > Running a compression test with the CompressionTest tool, we get the
> > > error below:
> > >
> > > [root@IMPETUS-I0141 hbase-0.98.3-hadoop2]# bin/hbase
> > > org.apache.hadoop.hbase.util.CompressionTest file:///tmp/test.txt snappy
> > > 2014-07-11 16:05:10,572 INFO  [main] Configuration.deprecation:
> > > hadoop.native.lib is deprecated. Instead, use io.native.lib.available
> > > 2014-07-11 16:05:11,006 WARN  [main] util.NativeCodeLoader: Unable to
> > > load native-hadoop library for your platform... using builtin-java
> > > classes where applicable
> > > 2014-07-11 16:05:11,241 INFO  [main] util.ChecksumType: Checksum using
> > > org.apache.hadoop.util.PureJavaCrc32
> > > 2014-07-11 16:05:11,242 INFO  [main] util.ChecksumType: Checksum can use
> > > org.apache.hadoop.util.PureJavaCrc32C
> > > Exception in thread "main" java.lang.UnsatisfiedLinkError:
> > > org.apache.hadoop.util.NativeCodeLoader.buildSupportsSnappy()Z
> > >     at org.apache.hadoop.util.NativeCodeLoader.buildSupportsSnappy(Native Method)
> > >     at org.apache.hadoop.io.compress.SnappyCodec.checkNativeCodeLoaded(SnappyCodec.java:62)
> > >     at org.apache.hadoop.io.compress.SnappyCodec.getCompressorType(SnappyCodec.java:131)
> > >     at org.apache.hadoop.io.compress.CodecPool.getCompressor(CodecPool.java:147)
> > >     at org.apache.hadoop.io.compress.CodecPool.getCompressor(CodecPool.java:162)
> > >     at org.apache.hadoop.hbase.io.compress.Compression$Algorithm.getCompressor(Compression.java:310)
> > >     at org.apache.hadoop.hbase.io.encoding.HFileBlockDefaultEncodingContext.<init>(HFileBlockDefaultEncodingContext.java:92)
> > >     at org.apache.hadoop.hbase.io.hfile.HFileBlock$Writer.<init>(HFileBlock.java:690)
> > >     at org.apache.hadoop.hbase.io.hfile.HFileWriterV2.finishInit(HFileWriterV2.java:117)
> > >     at org.apache.hadoop.hbase.io.hfile.HFileWriterV2.<init>(HFileWriterV2.java:109)
> > >     at org.apache.hadoop.hbase.io.hfile.HFileWriterV2$WriterFactoryV2.createWriter(HFileWriterV2.java:97)
> > >     at org.apache.hadoop.hbase.io.hfile.HFile$WriterFactory.create(HFile.java:393)
> > >     at org.apache.hadoop.hbase.util.CompressionTest.doSmokeTest(CompressionTest.java:118)
> > >     at org.apache.hadoop.hbase.util.CompressionTest.main(CompressionTest.java:148)
> > >
> > > On Fri, Jul 11, 2014 at 10:21 AM, Hanish Bansal <
> > > hanish.bansal.agarwal@gmail.com> wrote:
> > >
> > > > Hi All,
> > > >
> > > > I have recently upgraded our HBase environment from 0.94 to 0.98.3
> > > > and am now trying to use snappy compression with it.
> > > >
> > > > I have installed the snappy library following the guide at
> > > > https://hbase.apache.org/book/snappy.compression.html
> > > >
> > > > When I create a table with snappy compression enabled, I get the
> > > > error below:
> > > >
> > > >
> > > > hbase(main):001:0> create 'test', {NAME=>'cf1', COMPRESSION=>'SNAPPY'}
> > > > 2014-07-08 20:06:33,265 WARN  [main] util.NativeCodeLoader: Unable to
> > > > load native-hadoop library for your platform... using builtin-java
> > > > classes where applicable
> > > >
> > > > ERROR: java.io.IOException: Compression algorithm 'snappy' previously
> > > > failed test.
> > > > at org.apache.hadoop.hbase.util.CompressionTest.testCompression(CompressionTest.java:85)
> > > > at org.apache.hadoop.hbase.master.HMaster.checkCompression(HMaster.java:1774)
> > > > at org.apache.hadoop.hbase.master.HMaster.checkCompression(HMaster.java:1767)
> > > > at org.apache.hadoop.hbase.master.HMaster.createTable(HMaster.java:1749)
> > > > at org.apache.hadoop.hbase.master.HMaster.createTable(HMaster.java:1784)
> > > > at org.apache.hadoop.hbase.protobuf.generated.MasterProtos$MasterService$2.callBlockingMethod(MasterProtos.java:40470)
> > > > at org.apache.hadoop.hbase.ipc.RpcServer.call(RpcServer.java:2012)
> > > > at org.apache.hadoop.hbase.ipc.CallRunner.run(CallRunner.java:98)
> > > > at org.apache.hadoop.hbase.ipc.FifoRpcScheduler$1.run(FifoRpcScheduler.java:73)
> > > > at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:471)
> > > > at java.util.concurrent.FutureTask$Sync.innerRun(FutureTask.java:334)
> > > > at java.util.concurrent.FutureTask.run(FutureTask.java:166)
> > > > at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
> > > > at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
> > > > at java.lang.Thread.run(Thread.java:722)
> > > >
> > > >
> > > > Please let me know if anyone is aware of this issue.
> > > >
> > > > --
> > > > *Thanks & Regards*
> > > > *Hanish Bansal*
> > > >
> > >
> > >
> > >
> > > --
> > > *Thanks & Regards*
> > > *Hanish Bansal*
> > >
> >
>



-- 
Thanks,
Ankit Jain
