hbase-user mailing list archives

From Stack <st...@duboce.net>
Subject Re: How to install Snappy?
Date Mon, 03 Dec 2012 20:15:28 GMT
Thank you JM.  Let me fold it in....
St.Ack


On Mon, Dec 3, 2012 at 11:48 AM, Jean-Marc Spaggiari <
jean-marc@spaggiari.org> wrote:

> Done. HBASE-7264 created. I have added a few more details about the
> steps to follow to install Snappy on HBase 0.94.x.
>
> I don't know how to build the documentation locally, so I'm not 100%
> sure about the XML structure. I did cut & paste, so it should be good...
>
> JM
>
> 2012/12/3, Jean-Marc Spaggiari <jean-marc@spaggiari.org>:
> > Sure I will.
> >
> > JM
> >
> > 2012/12/3, Stack <stack@duboce.net>:
> >> Any chance of an update to
> >> http://hbase.apache.org/book.html#snappy.compression ?  If someone
> writes
> >> it up, I'll stitch it in. Thanks,
> >> St.Ack
> >>
> >>
> >> On Mon, Dec 3, 2012 at 6:29 AM, ac@hsk.hk <ac@hsk.hk> wrote:
> >>
> >>> Hi,
> >>>
> >>> Something more about my workaround last time:
> >>>
> >>> I used the following steps to test my workaround:
> >>>
> >>> 1) cd $HBASE_HOME
> >>> ./bin/hbase org.apache.hadoop.hbase.util.CompressionTest file:///tmp/testfile lzo
> >>>
> >>>
> >>> 2) hbase shell
> >>> create 't1', {NAME => 'cf1', COMPRESSION => 'LZO'}
> >>>
> >>>
> >>> You could modify the above for your test cases.
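The two checks above can be wrapped in a small script. This is a minimal sketch, not AC's actual procedure: it assumes HBASE_HOME points at the HBase install, and only prints what it would run when the hbase launcher is absent.

```shell
#!/bin/sh
# Hedged sketch: run HBase's codec smoke test for a given codec.
# HBASE_HOME is an assumption; when bin/hbase is not present on this
# machine, fall back to printing the command that would be run.
codec_smoke_test() {
  codec="$1"
  target="file:///tmp/testfile"
  if [ -x "${HBASE_HOME:-/nonexistent}/bin/hbase" ]; then
    "$HBASE_HOME/bin/hbase" org.apache.hadoop.hbase.util.CompressionTest "$target" "$codec"
  else
    echo "would run: CompressionTest $target $codec"
  fi
}

codec_smoke_test lzo
codec_smoke_test snappy
```

A SUCCESS result from CompressionTest is the signal that the codec is usable before creating a table with COMPRESSION => 'LZO' (or 'SNAPPY') in the shell.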
> >>>
> >>> Good luck
> >>> Thanks
> >>> AC
> >>>
> >>>
> >>> On 3 Dec 2012, at 10:22 PM, ac@hsk.hk wrote:
> >>>
> >>> > Hi JM,
> >>> >
> >>> > I experienced a similar error when I was installing LZO compression
> >>> > on the RegionServers:
> >>> >
> >>> > Below is from my record about installing LZO:
> >>> > Issue:
> >>> > java.lang.UnsatisfiedLinkError: no gplcompression in java.library.path
> >>> > ...
> >>> > 12/11/23 19:03:14 ERROR lzo.LzoCodec: Cannot load native-lzo without native-hadoop
> >>> > Exception in thread "main" java.lang.RuntimeException: native-lzo library not available
> >>> > Solution:
> >>> > (compiled lzo from source), then do these EXTRA steps:
> >>> > 1) cp {working_folder_of_lzo}/hadoop-lzo-master/build/native/Linux-amd64-64/lib/* /usr/local/lib/
> >>> > 2) echo export HBASE_LIBRARY_PATH=/usr/local/lib/ >> $HBASE_HOME/conf/hbase-env.sh
> >>> > 3) mkdir -p $HBASE_HOME/build
> >>> > 4) cp -r {working_folder_of_lzo}/hadoop-lzo-master/build/native $HBASE_HOME/build/native
> >>> >
> >>> >
> >>> >
> >>> > When looking at your email below, I saw your log also has "The error
> >>> > I'm getting is java.lang.UnsatisfiedLinkError: no hadoopsnappy in java.library.path."
> >>> > I think you could try these steps:
> >>> > 1) cp {working_folder_your_snappy}/build/native/Linux-amd64-64/lib/* /usr/local/lib/
> >>> > 2) echo export HBASE_LIBRARY_PATH=/usr/local/lib/ >> $HBASE_HOME/conf/hbase-env.sh
> >>> > 3) mkdir -p $HBASE_HOME/build
> >>> > 4) cp -r {working_folder_your_snappy}/build/native $HBASE_HOME/build/native
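The four steps can be collected into one parameterized function so the same procedure serves both the LZO and the Snappy build trees. A sketch only: the source layout, the destination lib directory, and the placeholder working folders are assumptions, and copying into a system directory such as /usr/local/lib may need root.

```shell
#!/bin/sh
# Sketch of the four extra steps above, parameterized. All paths are assumptions.
#   $1 = the build's native lib dir, e.g. {working_folder}/build/native/Linux-amd64-64/lib
#   $2 = HBASE_HOME
#   $3 = destination lib dir (e.g. /usr/local/lib)
install_native_libs() {
  src="$1"; hbase_home="$2"; libdir="$3"
  mkdir -p "$libdir" "$hbase_home/build" "$hbase_home/conf"
  # 1) copy the shared objects where the dynamic linker can find them
  cp "$src"/* "$libdir"/
  # 2) point HBase at that directory via hbase-env.sh
  echo "export HBASE_LIBRARY_PATH=$libdir/" >> "$hbase_home/conf/hbase-env.sh"
  # 3+4) keep a copy of the whole native tree under $HBASE_HOME/build
  cp -r "$(dirname "$(dirname "$src")")" "$hbase_home/build/native"
}
```

Restart the RegionServers after changing hbase-env.sh so the new HBASE_LIBRARY_PATH is picked up.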
> >>> >
> >>> >
> >>> > Good luck.
> >>> > Thanks
> >>> > AC
> >>> >
> >>> >
> >>> >
> >>> > On 3 Dec 2012, at 9:47 PM, Jean-Marc Spaggiari wrote:
> >>> >
> >>> >> Ok....
> >>> >>
> >>> >> This: http://code.google.com/p/hadoop-snappy/issues/detail?id=2
> >>> >> helped
> >>> >> me and my test program is now working. I'm able to load both
> >>> >> libraries. Fine.
> >>> >>
> >>> >> But the CompressionTest is still not working.
> >>> >>
> >>> >> What is very strange is that:
> >>> >> 12/12/03 08:44:24 WARN snappy.LoadSnappy: Snappy native library is available
> >>> >> 12/12/03 08:44:24 WARN snappy.LoadSnappy: Snappy native library not loaded
> >>> >>
> >>> >> It's available, but not loaded.
> >>> >>
> >>> >> But from the code:
> >>> >>  static {
> >>> >>    try {
> >>> >>      System.loadLibrary("snappy");
> >>> >>      System.loadLibrary("hadoopsnappy");
> >>> >>      LOG.warn("Snappy native library is available");
> >>> >>      AVAILABLE = true;
> >>> >>    } catch (UnsatisfiedLinkError ex) {
> >>> >>      //NOP
> >>> >>    }
> >>> >>    LOADED = AVAILABLE;
> >>> >>    if (LOADED) {
> >>> >>      LOG.info("Snappy native library loaded");
> >>> >>    } else {
> >>> >>      LOG.warn("Snappy native library not loaded");
> >>> >>    }
> >>> >>  }
> >>> >> If "Snappy native library is available" is displayed, that means
> >>> >> AVAILABLE = true... And if AVAILABLE = true, then LOADED is set to
> >>> >> true and "Snappy native library loaded" must be displayed... But it's
> >>> >> not... How is this possible?
> >>> >>
> >>> >> I did not expect Snappy installation to be such a challenge...
> >>> >>
> >>> >> I will continue to dig and will summarize the steps when I'm done
> >>> >> (if I'm able to finish...)
> >>> >>
> >>> >> JM
> >>> >>
> >>> >> 2012/12/3, Jean-Marc Spaggiari <jean-marc@spaggiari.org>:
> >>> >>> Thanks all for your replies.
> >>> >>>
> >>> >>> So, to reply to all in one.
> >>> >>>
> >>> >>> I'm not using CDH3. I'm using Hadoop 1.0.3 and HBase 0.94.2
> >>> >>> directly from the JARs.
> >>> >>>
> >>> >>> Here are all the places where I have put the lib:
> >>> >>> /home/hadoop/hadoop-1.0.3/lib/native/Linux-amd64-64/libsnappy.so
> >>> >>> /home/hadoop/hadoop-1.0.3/lib/native/Linux-amd64-64/libsnappy.so.1
> >>> >>> /home/hadoop/hadoop-1.0.3/lib/native/Linux-amd64-64/libsnappy.so.1.1.3
> >>> >>> /home/hadoop/hadoop-1.0.3/lib/native/Linux-i386-32/libsnappy.so
> >>> >>> /home/hadoop/hadoop-1.0.3/lib/native/Linux-i386-32/libsnappy.so.1
> >>> >>> /home/hadoop/hadoop-1.0.3/lib/native/Linux-i386-32/libsnappy.so.1.1.3
> >>> >>> /home/hbase/hbase-0.94.2/lib/native/libsnappy.so
> >>> >>> /home/hbase/hbase-0.94.2/lib/native/libsnappy.so.1
> >>> >>> /home/hbase/hbase-0.94.2/lib/native/libsnappy.so.1.1.3
> >>> >>> /home/hbase/hbase-0.94.2/lib/native/Linux-amd64-64/libsnappy.so
> >>> >>> /home/hbase/hbase-0.94.2/lib/native/Linux-amd64-64/libsnappy.so.1
> >>> >>> /home/hbase/hbase-0.94.2/lib/native/Linux-amd64-64/libsnappy.so.1.1.3
> >>> >>> /home/hbase/hbase-0.94.2/lib/native/Linux-i386-32/libsnappy.so
> >>> >>> /home/hbase/hbase-0.94.2/lib/native/Linux-i386-32/libsnappy.so.1
> >>> >>> /home/hbase/hbase-0.94.2/lib/native/Linux-i386-32/libsnappy.so.1.1.3
> >>> >>> /lib/x86_64-linux-gnu/libsnappy.so
> >>> >>> /usr/lib/libsnappy.so
> >>> >>> /usr/lib/libsnappy.so.1
> >>> >>> /usr/lib/libsnappy.so.1.1.3
> >>> >>> /usr/local/lib/libsnappy.so
> >>> >>> /usr/local/lib/libsnappy.so.1
> >>> >>> /usr/local/lib/libsnappy.so.1.1.3
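With copies scattered this widely, it can help to check which libsnappy the dynamic linker actually sees, as opposed to files merely sitting on disk. A hedged one-liner; the availability of ldconfig is an assumption (it is standard on Debian-family systems like the one in this thread):

```shell
#!/bin/sh
# List the libsnappy copies known to the dynamic linker's cache.
# Prints a fallback message when ldconfig is absent or finds nothing.
if command -v ldconfig >/dev/null 2>&1; then
  ldconfig -p | grep snappy || echo "libsnappy not in the ldconfig cache"
else
  echo "ldconfig not available here"
fi
```

Note that the JVM does not consult the ldconfig cache for System.loadLibrary; it searches java.library.path, which is why HBASE_LIBRARY_PATH matters independently of where the .so files live.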
> >>> >>>
> >>> >>> I tried to add this in my hbase-env.sh:
> >>> >>> export HBASE_LIBRARY_PATH=/home/hbase/hbase-0.94.2/lib/native/Linux-amd64-64
> >>> >>>
> >>> >>> Before that I was trying an export on the command line directly,
> >>> >>> since it seems the hbase script takes that into consideration too.
> >>> >>>
> >>> >>> I have not yet put in the hbase.regionserver.codecs line since I still
> >>> >>> need to use my cluster until I get snappy working. In the hbase/lib
> >>> >>> directory I have snappy-java-1.0.3.2.jar.
> >>> >>>
> >>> >>>
> >>> >>> Should snappy be installed within hbase? Or should it be in hadoop?
> >>> >>> I'm not sure anymore.
> >>> >>>
> >>> >>> But it's still not working. So I tried the small code below:
> >>> >>>
> >>> >>> import java.io.File;
> >>> >>> import java.util.StringTokenizer;
> >>> >>>
> >>> >>> public class Test {
> >>> >>>   static {
> >>> >>>     try {
> >>> >>>       System.loadLibrary("snappy");
> >>> >>>       System.loadLibrary("hadoopsnappy");
> >>> >>>       System.out.println("Snappy native library is available");
> >>> >>>     } catch (UnsatisfiedLinkError ex) {
> >>> >>>       ex.printStackTrace();
> >>> >>>     }
> >>> >>>   }
> >>> >>>
> >>> >>>   public static void main(String[] args) {
> >>> >>>     System.out.println("Coucou");
> >>> >>>     String property = System.getProperty("java.library.path");
> >>> >>>     // java.library.path uses ":" on Linux, not ";", so use the platform separator
> >>> >>>     StringTokenizer parser = new StringTokenizer(property, File.pathSeparator);
> >>> >>>     while (parser.hasMoreTokens()) {
> >>> >>>       System.err.println(parser.nextToken());
> >>> >>>     }
> >>> >>>   }
> >>> >>> }
> >>> >>>
> >>> >>>
> >>> >>> This code is from org.apache.hadoop.io.compress.snappy.LoadSnappy.
> >>> >>> The error I'm getting is java.lang.UnsatisfiedLinkError: no
> >>> >>> hadoopsnappy in java.library.path.
> >>> >>>
> >>> >>> So the issue is not the snappy lib. It's there and working fine. The
> >>> >>> issue is the hadoopsnappy lib, which I don't have...
> >>> >>>
> >>> >>> I found it there: http://code.google.com/p/hadoop-snappy/
> >>> >>>
> >>> >>> So I have checked it out with: svn checkout
> >>> >>> http://hadoop-snappy.googlecode.com/svn/trunk/ hadoop-snappy-read-only
> >>> >>> and tried to build it with mvn package, but it's failing with
> >>> >>> something saying "cannot find -ljvm".
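"cannot find -ljvm" usually means the linker cannot locate libjvm.so from the JDK. A common workaround, sketched here with hedges (the JAVA_HOME default and the jre/lib/amd64/server layout are assumptions matching a 2012-era 64-bit Sun/Oracle JDK), is to expose the server VM directory before re-running mvn package:

```shell
#!/bin/sh
# Sketch: make libjvm.so visible to the linker for the hadoop-snappy build.
# JAVA_HOME's default and the directory layout below are assumptions;
# adjust them to wherever your JDK keeps libjvm.so.
JVM_DIR="${JAVA_HOME:-/usr/lib/jvm/default-java}/jre/lib/amd64/server"
if [ -d "$JVM_DIR" ]; then
  export LD_LIBRARY_PATH="$JVM_DIR${LD_LIBRARY_PATH:+:$LD_LIBRARY_PATH}"
  echo "linker can now find libjvm in $JVM_DIR"
else
  echo "no libjvm directory at $JVM_DIR; locate libjvm.so and adjust"
fi
```

Alternatively, symlinking libjvm.so into a directory already on the linker's search path achieves the same; both are workarounds rather than an official fix.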
> >>> >>>
> >>> >>> So it seems my challenge will be to build hadoop-snappy, and not to
> >>> >>> install snappy, which is already there and working...
> >>> >>>
> >>> >>> JM
> >>> >>>
> >>> >>> 2012/12/3, surfer <surfer@crs4.it>:
> >>> >>>> Hope it helps. This is what I do on Apache Hadoop 1.0.x and HBase 0.92.y:
> >>> >>>> in hbase-site.xml add:
> >>> >>>> in hbase-site.xml add:
> >>> >>>>
> >>> >>>> <property>
> >>> >>>>   <name>hbase.regionserver.codecs</name>
> >>> >>>>   <value>snappy</value>
> >>> >>>> </property>
> >>> >>>>
> >>> >>>> copy that file into the hadoop conf directory.
> >>> >>>>
> >>> >>>> in hbase-env.sh:
> >>> >>>> export HBASE_LIBRARY_PATH=/pathtoyourhadoop/lib/native/Linux-amd64-64
> >>> >>>>
> >>> >>>> (In hbase-env.sh I also set HBASE_HOME, HBASE_CONF_DIR, HADOOP_HOME,
> >>> >>>> and HADOOP_CONF_DIR, but I don't know if they contribute to making
> >>> >>>> snappy work...)
> >>> >>>>
> >>> >>>> in /pathtoyourhadoop/lib/native/Linux-amd64-64 I have:
> >>> >>>> libsnappy.a
> >>> >>>> libsnappy.so
> >>> >>>> libsnappy.so.1
> >>> >>>> libsnappy.so.1.1.2
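The layout above can be sanity-checked with a short loop before restarting anything. /pathtoyourhadoop is the placeholder from the mail; HADOOP_HOME is an assumed override for your real install:

```shell
#!/bin/sh
# Verify the Snappy shared objects exist where HBASE_LIBRARY_PATH will point.
# The default path is the mail's placeholder; set HADOOP_HOME to override.
NATIVE_DIR="${HADOOP_HOME:-/pathtoyourhadoop}/lib/native/Linux-amd64-64"
for f in libsnappy.a libsnappy.so libsnappy.so.1; do
  if [ -e "$NATIVE_DIR/$f" ]; then
    echo "found $NATIVE_DIR/$f"
  else
    echo "missing $NATIVE_DIR/$f"
  fi
done
```

Any "missing" line means HBASE_LIBRARY_PATH would point at an incomplete directory and the Snappy smoke test will still fail.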
> >>> >>>>
> >>> >>>> good luck
> >>> >>>> giovanni
> >>> >>>>
> >>> >>>>
> >>> >>>>
> >>> >>>>
> >>> >>>>
> >>> >>>>
> >>> >>>> On 12/02/2012 02:25 PM, Jean-Marc Spaggiari wrote:
> >>> >>>>> So. I spent a few hours on that yesterday with no luck.
> >>> >>>>>
> >>> >>>>> Here is what I did:
> >>> >>>>> - Installed the google tar: untarred, configured, made, and installed it.
> >>> >>>>> - Copied the .so files all over my fs: the os lib dir,
> >>> >>>>> HBase/lib/native and subdirs, Hadoop/lib/native and subdirs.
> >>> >>>>> - Installed all debian packages with snappy in the name:
> >>> >>>>> python-snappy, libsnappy-dev, libsnappy1, libsnappy-java
> >>> >>>>>
> >>> >>>>> But still exactly the same issue as above. And I don't have any
> >>> >>>>> clue where to dig. There is nothing on the internet about that.
> >>> >>>>>
> >>> >>>>> Anyone faced that already while installing Snappy?
> >>> >>>>>
> >>> >>>>> JM
> >>> >>>>>
> >>> >>>>> 2012/12/1, Jean-Marc Spaggiari <jean-marc@spaggiari.org>:
> >>> >>>>>> Sorry, I forgot to paste a few maybe-useful lines. I have the lib
> >>> >>>>>> in /usr/local/lib copied properly, and I have HBASE_LIBRARY_PATH
> >>> >>>>>> set correctly. Do I need to restart HBase to run this test?
> >>> >>>>>>
> >>> >>>>>> hbase@node3:~/hbase-0.94.2$ export HBASE_LIBRARY_PATH=/usr/local/lib/
> >>> >>>>>> hbase@node3:~/hbase-0.94.2$ bin/hbase org.apache.hadoop.hbase.util.CompressionTest /tmp/test.txt snappy
> >>> >>>>>> 12/12/01 18:55:29 INFO util.ChecksumType: org.apache.hadoop.util.PureJavaCrc32 not available.
> >>> >>>>>> 12/12/01 18:55:29 INFO util.ChecksumType: Checksum can use java.util.zip.CRC32
> >>> >>>>>> 12/12/01 18:55:29 INFO util.ChecksumType: org.apache.hadoop.util.PureJavaCrc32C not available.
> >>> >>>>>> 12/12/01 18:55:29 DEBUG util.FSUtils: Creating file:/tmp/test.txt with permission:rwxrwxrwx
> >>> >>>>>> 12/12/01 18:55:29 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
> >>> >>>>>> 12/12/01 18:55:29 WARN metrics.SchemaConfigured: Could not determine table and column family of the HFile path /tmp/test.txt. Expecting at least 5 path components.
> >>> >>>>>> 12/12/01 18:55:29 WARN snappy.LoadSnappy: Snappy native library is available
> >>> >>>>>> 12/12/01 18:55:29 WARN snappy.LoadSnappy: Snappy native library not loaded
> >>> >>>>>> Exception in thread "main" java.lang.RuntimeException: native snappy library not available
> >>> >>>>>>  at org.apache.hadoop.io.compress.SnappyCodec.getCompressorType(SnappyCodec.java:123)
> >>> >>>>>>  at org.apache.hadoop.io.compress.CodecPool.getCompressor(CodecPool.java:100)
> >>> >>>>>>  at org.apache.hadoop.io.compress.CodecPool.getCompressor(CodecPool.java:112)
> >>> >>>>>>  at org.apache.hadoop.hbase.io.hfile.Compression$Algorithm.getCompressor(Compression.java:264)
> >>> >>>>>>  at org.apache.hadoop.hbase.io.hfile.HFileBlock$Writer.<init>(HFileBlock.java:739)
> >>> >>>>>>  at org.apache.hadoop.hbase.io.hfile.HFileWriterV2.finishInit(HFileWriterV2.java:127)
> >>> >>>>>>  at org.apache.hadoop.hbase.io.hfile.HFileWriterV2.<init>(HFileWriterV2.java:118)
> >>> >>>>>>  at org.apache.hadoop.hbase.io.hfile.HFileWriterV2$WriterFactoryV2.createWriter(HFileWriterV2.java:101)
> >>> >>>>>>  at org.apache.hadoop.hbase.io.hfile.HFile$WriterFactory.create(HFile.java:394)
> >>> >>>>>>  at org.apache.hadoop.hbase.util.CompressionTest.doSmokeTest(CompressionTest.java:108)
> >>> >>>>>>  at org.apache.hadoop.hbase.util.CompressionTest.main(CompressionTest.java:138)
> >>> >>>>>> hbase@node3:~/hbase-0.94.2$ ll /usr/local/lib/
> >>> >>>>>> total 572
> >>> >>>>>> -rw-r--r-- 1 root staff 391614 déc  1 18:33 libsnappy.a
> >>> >>>>>> -rwxr-xr-x 1 root staff    957 déc  1 18:33 libsnappy.la
> >>> >>>>>> lrwxrwxrwx 1 root staff     18 déc  1 18:33 libsnappy.so -> libsnappy.so.1.1.3
> >>> >>>>>> lrwxrwxrwx 1 root staff     18 déc  1 18:33 libsnappy.so.1 -> libsnappy.so.1.1.3
> >>> >>>>>> -rwxr-xr-x 1 root staff 178210 déc  1 18:33 libsnappy.so.1.1.3
> >>> >>>>>> drwxrwsr-x 4 root staff   4096 jui 13 10:06 python2.6
> >>> >>>>>> drwxrwsr-x 4 root staff   4096 jui 13 10:06 python2.7
> >>> >>>>>> hbase@node3:~/hbase-0.94.2$
> >>> >>>>>>
> >>> >>>>>>
> >>> >>>>>> 2012/12/1, Jean-Marc Spaggiari <jean-marc@spaggiari.org>:
> >>> >>>>>>> Hi,
> >>> >>>>>>>
> >>> >>>>>>> I'm currently using GZip and want to move to Snappy.
> >>> >>>>>>>
> >>> >>>>>>> I have downloaded the tar file, extracted, built, make install,
> >>> >>>>>>> make check, and everything is working fine.
> >>> >>>>>>>
> >>> >>>>>>> However, I'm not able to get this working:
> >>> >>>>>>> bin/hbase org.apache.hadoop.hbase.util.CompressionTest /tmp/test.txt snappy
> >>> >>>>>>> 12/12/01 18:46:21 WARN snappy.LoadSnappy: Snappy native library not loaded
> >>> >>>>>>> Exception in thread "main" java.lang.RuntimeException: native snappy library not available
> >>> >>>>>>>         at org.apache.hadoop.io.compress.SnappyCodec.getCompressorType(SnappyCodec.java:123)
> >>> >>>>>>>
> >>> >>>>>>> Sounds like HBase is not able to find the native library. How
> >>> >>>>>>> can I tell HBase where the library is?
> >>> >>>>>>>
> >>> >>>>>>> Thanks,
> >>> >>>>>>>
> >>> >>>>>>> JM
> >>> >>>>>>>
> >>> >>>>
> >>> >>>>
> >>> >>>
> >>> >
> >>>
> >>>
> >>
> >
>
