hbase-user mailing list archives

From Jean-Marc Spaggiari <jean-m...@spaggiari.org>
Subject Re: Snappy compression not working with HBase 0.98.3
Date Mon, 14 Jul 2014 14:53:03 GMT
Hi Hanish,

I don't know if that will help but I wrote that some time ago:
http://www.spaggiari.org/index.php/hbase/how-to-install-snappy-with-1#.U8Pui-9ZuZY

It's most probably the same steps with 0.98. When I need to test I simply
copy the lib/native folder that I built for 0.96 to the 0.98 folder...

JM
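The copy JM describes can be sketched as below. The version-numbered paths are illustrative assumptions, and the directories are simulated under a temp dir so the commands are self-contained; on a real box you would point SRC and DST at your actual 0.96 and 0.98 installs.

```shell
# Sketch: reuse the native libs built for HBase 0.96 in an HBase 0.98 install.
# Real paths would look like /opt/hbase-0.96.1/lib/native ; simulated here.
set -e
BASE=$(mktemp -d)
SRC="$BASE/hbase-0.96.1/lib/native/Linux-amd64-64"   # previously built natives (simulated)
DST="$BASE/hbase-0.98.3/lib/native"                  # fresh 0.98 install (simulated)

mkdir -p "$SRC" "$DST"
touch "$SRC/libsnappy.so" "$SRC/libhadoop.so"        # stand-ins for the real .so files

# The actual step: bring the whole platform folder across.
cp -r "$SRC" "$DST/"

ls "$DST/Linux-amd64-64"
```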


2014-07-14 10:49 GMT-04:00 Hanish Bansal <hanish.bansal.agarwal@gmail.com>:

> Hi All,
>
> We have tried below things:
>
> 1. Pointed HBase to hadoop and snappy libraries which hadoop holds :
>
> export HBASE_LIBRARY_PATH=/pathtoyourhadoop/lib/native/Linux-amd64-64
>
> As that directory holds both the hadoop and snappy libraries, it should work. But it didn't.
>
> 2. Copied libhadoop.so and libsnappy.so to hbase native library folder
> at $HBASE_HOME/lib/native/Linux-amd64-64/.
>
> It also didn't work.
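When both steps fail, it can help to first confirm the .so files are actually present in the directory HBase will search. A minimal diagnostic sketch (NATIVE_DIR is simulated with mktemp here so the commands are self-contained; on a real box it would be something like /pathtoyourhadoop/lib/native/Linux-amd64-64):

```shell
# Check that the native libraries HBase needs exist in the expected folder.
set -e
NATIVE_DIR=$(mktemp -d)                               # simulated for this sketch
touch "$NATIVE_DIR/libhadoop.so" "$NATIVE_DIR/libsnappy.so"

for lib in libhadoop.so libsnappy.so; do
  if [ -e "$NATIVE_DIR/$lib" ]; then
    echo "found: $lib"
  else
    echo "missing: $lib"
  fi
done
```

On a real install you could also run `ldd` on each .so to check that its dynamic dependencies resolve, and, if your Hadoop version ships it, `hadoop checknative -a` to see which natives Hadoop itself can load.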
>
> *Running the compression test tool, we get the error below:*
>
> [root@IMPETUS-I0141 hbase-0.98.3-hadoop2]# bin/hbase org.apache.hadoop.hbase.util.CompressionTest file:///tmp/test.txt snappy
> 2014-07-11 16:05:10,572 INFO  [main] Configuration.deprecation:
> hadoop.native.lib is deprecated. Instead, use io.native.lib.available
> 2014-07-11 16:05:11,006 WARN  [main] util.NativeCodeLoader: Unable to load
> native-hadoop library for your platform... using builtin-java classes where
> applicable
> 2014-07-11 16:05:11,241 INFO  [main] util.ChecksumType: Checksum using
> org.apache.hadoop.util.PureJavaCrc32
> 2014-07-11 16:05:11,242 INFO  [main] util.ChecksumType: Checksum can use
> org.apache.hadoop.util.PureJavaCrc32C
> Exception in thread "main" java.lang.UnsatisfiedLinkError:
> org.apache.hadoop.util.NativeCodeLoader.buildSupportsSnappy()Z
>     at org.apache.hadoop.util.NativeCodeLoader.buildSupportsSnappy(Native Method)
>     at org.apache.hadoop.io.compress.SnappyCodec.checkNativeCodeLoaded(SnappyCodec.java:62)
>     at org.apache.hadoop.io.compress.SnappyCodec.getCompressorType(SnappyCodec.java:131)
>     at org.apache.hadoop.io.compress.CodecPool.getCompressor(CodecPool.java:147)
>     at org.apache.hadoop.io.compress.CodecPool.getCompressor(CodecPool.java:162)
>     at org.apache.hadoop.hbase.io.compress.Compression$Algorithm.getCompressor(Compression.java:310)
>     at org.apache.hadoop.hbase.io.encoding.HFileBlockDefaultEncodingContext.<init>(HFileBlockDefaultEncodingContext.java:92)
>     at org.apache.hadoop.hbase.io.hfile.HFileBlock$Writer.<init>(HFileBlock.java:690)
>     at org.apache.hadoop.hbase.io.hfile.HFileWriterV2.finishInit(HFileWriterV2.java:117)
>     at org.apache.hadoop.hbase.io.hfile.HFileWriterV2.<init>(HFileWriterV2.java:109)
>     at org.apache.hadoop.hbase.io.hfile.HFileWriterV2$WriterFactoryV2.createWriter(HFileWriterV2.java:97)
>     at org.apache.hadoop.hbase.io.hfile.HFile$WriterFactory.create(HFile.java:393)
>     at org.apache.hadoop.hbase.util.CompressionTest.doSmokeTest(CompressionTest.java:118)
>     at org.apache.hadoop.hbase.util.CompressionTest.main(CompressionTest.java:148)
>
> Everything was working fine with hbase-0.94.5 as well as hbase-0.96.1.
>
>
>
>
>
> On Mon, Jul 14, 2014 at 5:40 AM, Stack <stack@duboce.net> wrote:
>
> > On Sun, Jul 13, 2014 at 10:28 PM, Esteban Gutierrez <
> esteban@cloudera.com>
> > wrote:
> >
> > > Hello Ankit,
> > >
> > > The only reason the test can fail on the master is that the snappy
> > > native libraries are not installed correctly. Have you tried to run the
> > > compression test (hbase org.apache.hadoop.hbase.util.CompressionTest
> > > file:///tmp snappy) on the master? Does it work? If it works correctly
> > > then you only need to restart the HBase masters in order to get it
> > > working.
> > >
> > > cheers,
> > > esteban.
> > >
> > >
> > >
> > >
> > What Esteban says.
> >
> > We added a little bit more on how hbase finds native libs to the doc but
> > have not pushed it out to the website.  Perhaps it will help in this case
> > (pardon the formatting):
> >
> >      <note xml:id="hbase.native.platform"><title>On the location of
> > native libraries</title>
> >
> >          <para>Hadoop looks in <filename>lib/native</filename> for .so
> > files. HBase looks in <filename>lib/native/PLATFORM</filename>. See the
> > <command>bin/hbase</command> script; view the file and look for
> > <varname>native</varname>. See how we work out what platform we are
> > running on by running a little java program,
> > <classname>org.apache.hadoop.util.PlatformName</classname>.
> > We then add <filename>./lib/native/PLATFORM</filename> to the
> > <varname>LD_LIBRARY_PATH</varname> environment for when the JVM starts.
> > The JVM will look in here (as well as in any other dirs specified on
> > LD_LIBRARY_PATH) for codec native libs. If you are unable to figure out
> > your 'platform', do:
> > <programlisting>$ ./bin/hbase org.apache.hadoop.util.PlatformName</programlisting>
> > An example platform would be <varname>Linux-amd64-64</varname>.
> >          </para>
> >
> >      </note>
> >
> > St.Ack
> >
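The platform lookup the doc note describes can be sketched as follows. The HBASE_HOME directory is simulated with mktemp so the commands are self-contained, and the PLATFORM value is hard-coded as an assumption of what `org.apache.hadoop.util.PlatformName` would print on a 64-bit Linux box.

```shell
# Sketch of what bin/hbase does: detect the platform, then prepend
# lib/native/PLATFORM to LD_LIBRARY_PATH before the JVM starts.
set -e
HBASE_HOME=$(mktemp -d)          # illustrative stand-in for a real install
PLATFORM="Linux-amd64-64"        # what PlatformName would report here (assumed)
mkdir -p "$HBASE_HOME/lib/native/$PLATFORM"

export LD_LIBRARY_PATH="$HBASE_HOME/lib/native/$PLATFORM:${LD_LIBRARY_PATH:-}"
echo "$LD_LIBRARY_PATH"
```

With this in place the JVM searches that directory (plus anything else already on LD_LIBRARY_PATH) for the codec natives.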
>
>
>
> --
> *Thanks & Regards*
> *Hanish Bansal*
>
