hbase-user mailing list archives

From "Kevin O'dell" <kevin.od...@cloudera.com>
Subject Re: How to install Snappy?
Date Mon, 03 Dec 2012 15:19:08 GMT
Never say die!

On Mon, Dec 3, 2012 at 10:15 AM, Jean-Marc Spaggiari <
jean-marc@spaggiari.org> wrote:

> Ok, I got it!!!!
>
> I had to copy the hadoop native libs into the hbase native libs directory!
>
> Now I get a SUCCESS when I'm doing the CompressionTest...
>
> I'm not 100% sure that it's the only thing which was missing because I
> have done so many modifications in the last 3 days...
>
> So I will start from a blank 0.94.3 jar and re-do all the steps to
> make sure it's just the native libs which need to be copied.
>
> I was close to surrender ;)
>
> JM
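[Editor's note] The fix JM describes above — copying the Hadoop native libs into the HBase native libs directory — can be sketched as below. This is a minimal sketch: the Hadoop 1.0.3 / HBase 0.94.2 paths in the comments are the ones from this thread, and the demo uses throwaway directories so the snippet can run anywhere.

```shell
# Real-cluster paths (assumed from this thread) would be:
#   HADOOP_NATIVE=/home/hadoop/hadoop-1.0.3/lib/native/Linux-amd64-64
#   HBASE_NATIVE=/home/hbase/hbase-0.94.2/lib/native/Linux-amd64-64
# For a self-contained demo we use throwaway directories instead.
BASE="$(mktemp -d)"
HADOOP_NATIVE="$BASE/hadoop-1.0.3/lib/native/Linux-amd64-64"
HBASE_NATIVE="$BASE/hbase-0.94.2/lib/native/Linux-amd64-64"

mkdir -p "$HADOOP_NATIVE" "$HBASE_NATIVE"
# (demo only) stand-ins for the libs a real Hadoop install would contain
touch "$HADOOP_NATIVE/libhadoop.so" "$HADOOP_NATIVE/libsnappy.so.1.1.3"

# The actual fix: make Hadoop's native libs visible to HBase too.
cp "$HADOOP_NATIVE"/* "$HBASE_NATIVE"/
ls "$HBASE_NATIVE"
```

After copying on a real node, restart HBase and re-run `bin/hbase org.apache.hadoop.hbase.util.CompressionTest /tmp/test.txt snappy` to verify.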
>
> 2012/12/3, Jean-Marc Spaggiari <jean-marc@spaggiari.org>:
> > Hi Kevin,
> >
> > Thanks for the clarification.
> >
> > No, it's not what I'm seeing.
> >
> > Here is what I'm getting:
> >
> > 12/12/03 09:40:42 WARN snappy.LoadSnappy: Snappy native library is
> > available
> > 12/12/03 09:40:42 WARN snappy.LoadSnappy: Snappy native library not
> loaded
> > Exception in thread "main" java.lang.RuntimeException: native snappy
> > library not available
> >       at org.apache.hadoop.io.compress.SnappyCodec.getCompressorType(SnappyCodec.java:123)
> >       at org.apache.hadoop.io.compress.CodecPool.getCompressor(CodecPool.java:100)
> >       at org.apache.hadoop.io.compress.CodecPool.getCompressor(CodecPool.java:112)
> >       at org.apache.hadoop.hbase.io.hfile.Compression$Algorithm.getCompressor(Compression.java:264)
> >       at org.apache.hadoop.hbase.io.hfile.HFileBlock$Writer.<init>(HFileBlock.java:739)
> >       at org.apache.hadoop.hbase.io.hfile.HFileWriterV2.finishInit(HFileWriterV2.java:127)
> >       at org.apache.hadoop.hbase.io.hfile.HFileWriterV2.<init>(HFileWriterV2.java:118)
> >       at org.apache.hadoop.hbase.io.hfile.HFileWriterV2$WriterFactoryV2.createWriter(HFileWriterV2.java:101)
> >       at org.apache.hadoop.hbase.io.hfile.HFile$WriterFactory.create(HFile.java:394)
> >       at org.apache.hadoop.hbase.util.CompressionTest.doSmokeTest(CompressionTest.java:108)
> >       at org.apache.hadoop.hbase.util.CompressionTest.main(CompressionTest.java:138)
> >
> > The most disturbing part is this line:
> > 12/12/03 09:40:42 WARN snappy.LoadSnappy: Snappy native library is
> > available
> >
> > Followed by this one:
> > Exception in thread "main" java.lang.RuntimeException: native snappy
> > library not available
> >
> > Is it available? Or is it not available?
> >
> > I looked in the code and I have no idea why the 2nd one is raised.
> >
> > The code I'm looking at is on the hadoop-snappy site, but the one I
> > have on my server is in the hadoop-core-1.0.3.jar file. So maybe
> > that's the issue and they are different?
> >
> > I built the hadoop-snappy-0.0.1-SNAPSHOT.jar file too. I placed it in
> > the lib folder and made sure it was taken first, but it's still not
> > working.
> >
> > So far I think I will stay with GZip until Snappy is integrated into
> > the HBase distribution...
> >
> > JM
> >
> > 2012/12/3, Kevin O'dell <kevin.odell@cloudera.com>:
> >> Hey JM,
> >>
> >>   Sorry for the quick message earlier.  I tracked down the JIRA I was
> >> referring to: https://issues.apache.org/jira/browse/HBASE-7080
> >>
> >> Does this look like what you are seeing in the CompressionTest?
> >>
> >> On Mon, Dec 3, 2012 at 9:09 AM, Kevin O'dell
> >> <kevin.odell@cloudera.com>wrote:
> >>
> >>> There is a compression test JIRA right now.  What are you seeing?
> >>>
> >>>
> >>> On Mon, Dec 3, 2012 at 8:47 AM, Jean-Marc Spaggiari <
> >>> jean-marc@spaggiari.org> wrote:
> >>>
> >>>> Ok....
> >>>>
> >>>> This: http://code.google.com/p/hadoop-snappy/issues/detail?id=2
> >>>> helped me and my test program is now working. I'm able to load both
> >>>> libraries. Fine.
> >>>>
> >>>> But the CompressionTest is still not working.
> >>>>
> >>>> What is very strange is that:
> >>>> 12/12/03 08:44:24 WARN snappy.LoadSnappy: Snappy native library is
> >>>> available
> >>>> 12/12/03 08:44:24 WARN snappy.LoadSnappy: Snappy native library not
> >>>> loaded
> >>>>
> >>>> It's available, but not loaded.
> >>>>
> >>>> But from the code:
> >>>>   static {
> >>>>     try {
> >>>>       System.loadLibrary("snappy");
> >>>>       System.loadLibrary("hadoopsnappy");
> >>>>       LOG.warn("Snappy native library is available");
> >>>>       AVAILABLE = true;
> >>>>     } catch (UnsatisfiedLinkError ex) {
> >>>>       //NOP
> >>>>     }
> >>>>     LOADED = AVAILABLE;
> >>>>     if (LOADED) {
> >>>>       LOG.info("Snappy native library loaded");
> >>>>     } else {
> >>>>       LOG.warn("Snappy native library not loaded");
> >>>>     }
> >>>>   }
> >>>> If "Snappy native library is available" is displayed, that means
> >>>> AVAILABLE = true... And if AVAILABLE = true, then LOADED is set to
> >>>> true and "Snappy native library loaded" must be displayed... But it's
> >>>> not... How is this possible?
> >>>>
> >>>> I have not expected Snappy installation to be such a challenge...
> >>>>
> >>>> I will continue to dig and summarize the steps when I'm done (if
> >>>> I'm able to finish...)
> >>>>
> >>>> JM
> >>>>
> >>>> 2012/12/3, Jean-Marc Spaggiari <jean-marc@spaggiari.org>:
> >>>> > Thanks all for your replies.
> >>>> >
> >>>> > So, to reply to all in one.
> >>>> >
> >>>> > I'm not using CDH3. I'm using Hadoop 1.0.3 and HBase 0.94.2 directly
> >>>> > from the JARs.
> >>>> >
> >>>> > Here are all the places where I have put the lib:
> >>>> > /home/hadoop/hadoop-1.0.3/lib/native/Linux-amd64-64/libsnappy.so
> >>>> > /home/hadoop/hadoop-1.0.3/lib/native/Linux-amd64-64/libsnappy.so.1
> >>>> > /home/hadoop/hadoop-1.0.3/lib/native/Linux-amd64-64/libsnappy.so.1.1.3
> >>>> > /home/hadoop/hadoop-1.0.3/lib/native/Linux-i386-32/libsnappy.so
> >>>> > /home/hadoop/hadoop-1.0.3/lib/native/Linux-i386-32/libsnappy.so.1
> >>>> > /home/hadoop/hadoop-1.0.3/lib/native/Linux-i386-32/libsnappy.so.1.1.3
> >>>> > /home/hbase/hbase-0.94.2/lib/native/libsnappy.so
> >>>> > /home/hbase/hbase-0.94.2/lib/native/libsnappy.so.1
> >>>> > /home/hbase/hbase-0.94.2/lib/native/libsnappy.so.1.1.3
> >>>> > /home/hbase/hbase-0.94.2/lib/native/Linux-amd64-64/libsnappy.so
> >>>> > /home/hbase/hbase-0.94.2/lib/native/Linux-amd64-64/libsnappy.so.1
> >>>> > /home/hbase/hbase-0.94.2/lib/native/Linux-amd64-64/libsnappy.so.1.1.3
> >>>> > /home/hbase/hbase-0.94.2/lib/native/Linux-i386-32/libsnappy.so
> >>>> > /home/hbase/hbase-0.94.2/lib/native/Linux-i386-32/libsnappy.so.1
> >>>> > /home/hbase/hbase-0.94.2/lib/native/Linux-i386-32/libsnappy.so.1.1.3
> >>>> > /lib/x86_64-linux-gnu/libsnappy.so
> >>>> > /usr/lib/libsnappy.so
> >>>> > /usr/lib/libsnappy.so.1
> >>>> > /usr/lib/libsnappy.so.1.1.3
> >>>> > /usr/local/lib/libsnappy.so
> >>>> > /usr/local/lib/libsnappy.so.1
> >>>> > /usr/local/lib/libsnappy.so.1.1.3
> >>>> >
> >>>> > I tried to add this in my hbase-env.sh:
> >>>> > export HBASE_LIBRARY_PATH=/home/hbase/hbase-0.94.2/lib/native/Linux-amd64-64
> >>>> >
> >>>> > Before I was trying with doing export on the command line directly
> >>>> > since it seems the hbase script is taking that into consideration
> >>>> > too.
> >>>> >
> >>>> > I have not yet put the hbase.regionserver.codecs line since I still
> >>>> > need to use my cluster until I get snappy working. In the hbase/lib
> >>>> > directory I have snappy-java-1.0.3.2.jar.
> >>>> >
> >>>> >
> >>>> > Should snappy be installed within hbase? Or should it be in hadoop?
> >>>> > I'm not sure anymore.
> >>>> >
> >>>> > But it's still not working. So I tried the small code below:
> >>>> >
> >>>> > import java.util.StringTokenizer;
> >>>> >
> >>>> > public class Test {
> >>>> >   static {
> >>>> >     try {
> >>>> >       System.loadLibrary("snappy");
> >>>> >       System.loadLibrary("hadoopsnappy");
> >>>> >       System.out.println("Snappy native library is available");
> >>>> >     } catch (UnsatisfiedLinkError ex) {
> >>>> >       ex.printStackTrace();
> >>>> >     }
> >>>> >   }
> >>>> >
> >>>> >   public static void main(String[] args) {
> >>>> >     System.out.println("Coucou");
> >>>> >     String property = System.getProperty("java.library.path");
> >>>> >     // java.library.path entries are ":"-separated on Linux
> >>>> >     StringTokenizer parser = new StringTokenizer(property, ":");
> >>>> >     while (parser.hasMoreTokens()) {
> >>>> >       System.err.println(parser.nextToken());
> >>>> >     }
> >>>> >   }
> >>>> > }
> >>>> >
> >>>> >
> >>>> > This code is from org.apache.hadoop.io.compress.snappy.LoadSnappy.
> >>>> > The error I'm getting is java.lang.UnsatisfiedLinkError: no
> >>>> > hadoopsnappy in java.library.path.
> >>>> >
> >>>> > So the issue is not the snappy lib. It's there and working fine. The
> >>>> > issue is the hadoopsnappy lib, which I don't have...
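[Editor's note] The UnsatisfiedLinkError above means exactly this: System.loadLibrary("hadoopsnappy") only succeeds if a file named libhadoopsnappy.so sits in one of the java.library.path directories. A small helper to check a set of directories for a given native lib (a sketch; on this thread's machines the directories to pass would be the ones listed earlier, e.g. /usr/local/lib):

```shell
# find_native_lib NAME DIR... -> print every lib$NAME.so* found in the
# given directories; returns non-zero when nothing is found.
find_native_lib() {
  name="$1"; shift
  found=1
  for d in "$@"; do
    for f in "$d/lib$name.so" "$d/lib$name.so".*; do
      if [ -e "$f" ]; then
        echo "$f"
        found=0
      fi
    done
  done
  return $found
}
```

Running `find_native_lib hadoopsnappy /usr/local/lib /usr/lib` would print nothing here, which matches the error: libsnappy is installed everywhere, libhadoopsnappy nowhere.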
> >>>> >
> >>>> > I found it there: http://code.google.com/p/hadoop-snappy/
> >>>> >
> >>>> > So I have checked it out with svn checkout
> >>>> > http://hadoop-snappy.googlecode.com/svn/trunk/ hadoop-snappy-read-only
> >>>> > and tried to build it with mvn package, but it's failing with
> >>>> > something saying "cannot find -ljvm"
> >>>> >
> >>>> > So it seems my challenge will be to build hadoop-snappy, not to
> >>>> > install snappy, which is already there and working...
> >>>> >
> >>>> > JM
> >>>> >
> >>>> > 2012/12/3, surfer <surfer@crs4.it>:
> >>>> >> hope it helps. this is what I do on apache hadoop 1.0.x and hbase 0.92.y:
> >>>> >> in hbase-site.xml add:
> >>>> >>
> >>>> >> <property>
> >>>> >> <name>hbase.regionserver.codecs</name>
> >>>> >> <value>snappy</value>
> >>>> >> </property>
> >>>> >>
> >>>> >> copy that file into the hadoop conf directory.
> >>>> >>
> >>>> >> in hbase-env.sh:
> >>>> >> export
> >>>> >> HBASE_LIBRARY_PATH=/pathtoyourhadoop/lib/native/Linux-amd64-64
> >>>> >>
> >>>> >> (In hbase-env.sh I also set HBASE_HOME, HBASE_CONF_DIR,
> >>>> >> HADOOP_HOME and HADOOP_CONF_DIR, but I don't know if they
> >>>> >> contribute to making snappy work...)
> >>>> >>
> >>>> >> in /pathtoyourhadoop/lib/native/Linux-amd64-64 I have:
> >>>> >> libsnappy.a
> >>>> >> libsnappy.so
> >>>> >> libsnappy.so.1
> >>>> >> libsnappy.so.1.1.2
> >>>> >>
> >>>> >> good luck
> >>>> >> giovanni
> >>>> >>
> >>>> >>
> >>>> >>
> >>>> >>
> >>>> >>
> >>>> >>
> >>>> >> On 12/02/2012 02:25 PM, Jean-Marc Spaggiari wrote:
> >>>> >>> So. I spent a few hours on that yesterday with no luck.
> >>>> >>>
> >>>> >>> Here is what I did:
> >>>> >>> - Installed the google tar: untarred, configured, ran make and
> >>>> >>> make install.
> >>>> >>> - Copied the .so files all over my fs: in the os lib dir,
> >>>> >>> HBase/lib/native and subdirs, Hadoop/lib/native and subdirs.
> >>>> >>> - Installed all debian packages with snappy in the name:
> >>>> >>> python-snappy, libsnappy-dev, libsnappy1, libsnappy-java
> >>>> >>>
> >>>> >>> But still exactly the same issue as above. And I don't have any
> >>>> >>> clue where to dig. There is nothing on the internet about that.
> >>>> >>>
> >>>> >>> Anyone faced that already while installing Snappy?
> >>>> >>>
> >>>> >>> JM
> >>>> >>>
> >>>> >>> 2012/12/1, Jean-Marc Spaggiari <jean-marc@spaggiari.org>:
> >>>> >>>> Sorry, I forgot to paste a few maybe-useful lines. I have the lib
> >>>> >>>> in /usr/local/lib copied properly, and I have HBASE_LIBRARY_PATH
> >>>> >>>> set correctly. Do I need to restart HBase to run this test?
> >>>> >>>>
> >>>> >>>> hbase@node3:~/hbase-0.94.2$ export HBASE_LIBRARY_PATH=/usr/local/lib/
> >>>> >>>> hbase@node3:~/hbase-0.94.2$ bin/hbase org.apache.hadoop.hbase.util.CompressionTest /tmp/test.txt snappy
> >>>> >>>> 12/12/01 18:55:29 INFO util.ChecksumType: org.apache.hadoop.util.PureJavaCrc32 not available.
> >>>> >>>> 12/12/01 18:55:29 INFO util.ChecksumType: Checksum can use java.util.zip.CRC32
> >>>> >>>> 12/12/01 18:55:29 INFO util.ChecksumType: org.apache.hadoop.util.PureJavaCrc32C not available.
> >>>> >>>> 12/12/01 18:55:29 DEBUG util.FSUtils: Creating file:/tmp/test.txt with permission:rwxrwxrwx
> >>>> >>>> 12/12/01 18:55:29 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
> >>>> >>>> 12/12/01 18:55:29 WARN metrics.SchemaConfigured: Could not determine table and column family of the HFile path /tmp/test.txt. Expecting at least 5 path components.
> >>>> >>>> 12/12/01 18:55:29 WARN snappy.LoadSnappy: Snappy native library is available
> >>>> >>>> 12/12/01 18:55:29 WARN snappy.LoadSnappy: Snappy native library not loaded
> >>>> >>>> Exception in thread "main" java.lang.RuntimeException: native snappy library not available
> >>>> >>>>    at org.apache.hadoop.io.compress.SnappyCodec.getCompressorType(SnappyCodec.java:123)
> >>>> >>>>    at org.apache.hadoop.io.compress.CodecPool.getCompressor(CodecPool.java:100)
> >>>> >>>>    at org.apache.hadoop.io.compress.CodecPool.getCompressor(CodecPool.java:112)
> >>>> >>>>    at org.apache.hadoop.hbase.io.hfile.Compression$Algorithm.getCompressor(Compression.java:264)
> >>>> >>>>    at org.apache.hadoop.hbase.io.hfile.HFileBlock$Writer.<init>(HFileBlock.java:739)
> >>>> >>>>    at org.apache.hadoop.hbase.io.hfile.HFileWriterV2.finishInit(HFileWriterV2.java:127)
> >>>> >>>>    at org.apache.hadoop.hbase.io.hfile.HFileWriterV2.<init>(HFileWriterV2.java:118)
> >>>> >>>>    at org.apache.hadoop.hbase.io.hfile.HFileWriterV2$WriterFactoryV2.createWriter(HFileWriterV2.java:101)
> >>>> >>>>    at org.apache.hadoop.hbase.io.hfile.HFile$WriterFactory.create(HFile.java:394)
> >>>> >>>>    at org.apache.hadoop.hbase.util.CompressionTest.doSmokeTest(CompressionTest.java:108)
> >>>> >>>>    at org.apache.hadoop.hbase.util.CompressionTest.main(CompressionTest.java:138)
> >>>> >>>> hbase@node3:~/hbase-0.94.2$ ll /usr/local/lib/
> >>>> >>>> total 572
> >>>> >>>> -rw-r--r-- 1 root staff 391614 déc  1 18:33 libsnappy.a
> >>>> >>>> -rwxr-xr-x 1 root staff    957 déc  1 18:33 libsnappy.la
> >>>> >>>> lrwxrwxrwx 1 root staff     18 déc  1 18:33 libsnappy.so -> libsnappy.so.1.1.3
> >>>> >>>> lrwxrwxrwx 1 root staff     18 déc  1 18:33 libsnappy.so.1 -> libsnappy.so.1.1.3
> >>>> >>>> -rwxr-xr-x 1 root staff 178210 déc  1 18:33 libsnappy.so.1.1.3
> >>>> >>>> drwxrwsr-x 4 root staff   4096 jui 13 10:06 python2.6
> >>>> >>>> drwxrwsr-x 4 root staff   4096 jui 13 10:06 python2.7
> >>>> >>>> hbase@node3:~/hbase-0.94.2$
> >>>> >>>>
> >>>> >>>>
> >>>> >>>> 2012/12/1, Jean-Marc Spaggiari <jean-marc@spaggiari.org>:
> >>>> >>>>> Hi,
> >>>> >>>>>
> >>>> >>>>> I'm currently using GZip and want to move to Snappy.
> >>>> >>>>>
> >>>> >>>>> I have downloaded the tar file, extracted it, ran the build,
> >>>> >>>>> make install and make check; everything is working fine.
> >>>> >>>>>
> >>>> >>>>> However, I'm not able to get this working:
> >>>> >>>>> bin/hbase org.apache.hadoop.hbase.util.CompressionTest /tmp/test.txt snappy
> >>>> >>>>> 12/12/01 18:46:21 WARN snappy.LoadSnappy: Snappy native library not loaded
> >>>> >>>>> Exception in thread "main" java.lang.RuntimeException: native snappy library not available
> >>>> >>>>>   at org.apache.hadoop.io.compress.SnappyCodec.getCompressorType(SnappyCodec.java:123)
> >>>> >>>>>
> >>>> >>>>> Sounds like HBase is not able to find the native library. How
> >>>> >>>>> can I tell HBase where the library is?
> >>>> >>>>>
> >>>> >>>>> Thanks,
> >>>> >>>>>
> >>>> >>>>> JM
> >>>> >>>>>
> >>>> >>
> >>>> >>
> >>>> >
> >>>>
> >>>
> >>>
> >>>
> >>> --
> >>> Kevin O'Dell
> >>> Customer Operations Engineer, Cloudera
> >>>
> >>
> >>
> >>
> >> --
> >> Kevin O'Dell
> >> Customer Operations Engineer, Cloudera
> >>
> >
>



-- 
Kevin O'Dell
Customer Operations Engineer, Cloudera
