hbase-user mailing list archives

From Jean-Marc Spaggiari <jean-m...@spaggiari.org>
Subject Re: Compilation error: HBASE 0.98.4 with Snappy
Date Wed, 27 Aug 2014 10:41:15 GMT
Hi Arthur,

Glad to hear you got it!

Regarding #2, was JAVA_LIBRARY_PATH already set before? If so, that might
have been the issue. HBase will append everything it needs to this path
(if required), so I don't think there is anything else you need to add.
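
If you want to double-check what actually gets loaded, Hadoop ships a
small native-library checker; assuming $HADOOP_HOME points at your 2.4.1
install, it should report hadoop: true and snappy: true once everything
is in place:

$HADOOP_HOME/bin/hadoop checknative -a
# -a makes it exit non-zero if any native library fails to load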

Regarding #1, I don't think it's an error; it might be more of a warning.
I will look at it to see where it comes from...

JM


2014-08-27 4:00 GMT-04:00 Arthur.hk.chan@gmail.com <arthur.hk.chan@gmail.com>:

> Hi,
>
> Many thanks for your advice!
>
> Finally, I managed to make it work.
>
> I needed to add:
> export JAVA_LIBRARY_PATH="$HBASE_HOME/lib/native/Linux-amd64-64"
>
> then run:
> bin/hbase org.apache.hadoop.hbase.util.CompressionTest file:///tmp/snappy-test snappy
> 2014-08-27 15:51:39,459 INFO  [main] Configuration.deprecation: hadoop.native.lib is deprecated. Instead, use io.native.lib.available
> SLF4J: Class path contains multiple SLF4J bindings.
> SLF4J: Found binding in [jar:file:/mnt/hadoop/hbase-0.98.4-hadoop2/lib/slf4j-log4j12-1.6.4.jar!/org/slf4j/impl/StaticLoggerBinder.class]
> SLF4J: Found binding in [jar:file:/mnt/hadoop/hadoop-2.4.1/share/hadoop/common/lib/slf4j-log4j12-1.7.5.jar!/org/slf4j/impl/StaticLoggerBinder.class]
> SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
> 2014-08-27 15:51:39,785 INFO  [main] util.ChecksumType: Checksum using org.apache.hadoop.util.PureJavaCrc32
> 2014-08-27 15:51:39,786 INFO  [main] util.ChecksumType: Checksum can use org.apache.hadoop.util.PureJavaCrc32C
> 2014-08-27 15:51:39,926 INFO  [main] compress.CodecPool: Got brand-new compressor [.snappy]
> 2014-08-27 15:51:39,930 INFO  [main] compress.CodecPool: Got brand-new compressor [.snappy]
> 2014-08-27 15:51:39,934 ERROR [main] hbase.KeyValue: Unexpected getShortMidpointKey result, fakeKey:testkey, firstKeyInBlock:testkey
> 2014-08-27 15:51:40,185 INFO  [main] compress.CodecPool: Got brand-new decompressor [.snappy]
> SUCCESS
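
(Side note: the same smoke test can be pointed at an HDFS path once the
daemons are up; the NameNode URI below is a placeholder:)

bin/hbase org.apache.hadoop.hbase.util.CompressionTest hdfs://namenode:8020/tmp/snappy-test snappy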
>
>
> bin/hbase org.apache.hadoop.hbase.util.CompressionTest file:///tmp/snappy-test gz
> 2014-08-27 15:57:18,633 INFO  [main] Configuration.deprecation: hadoop.native.lib is deprecated. Instead, use io.native.lib.available
> SLF4J: Class path contains multiple SLF4J bindings.
> SLF4J: Found binding in [jar:file:/mnt/hadoop/hbase-0.98.4-hadoop2/lib/slf4j-log4j12-1.6.4.jar!/org/slf4j/impl/StaticLoggerBinder.class]
> SLF4J: Found binding in [jar:file:/mnt/hadoop/hadoop-2.4.1/share/hadoop/common/lib/slf4j-log4j12-1.7.5.jar!/org/slf4j/impl/StaticLoggerBinder.class]
> SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
> 2014-08-27 15:57:18,969 INFO  [main] util.ChecksumType: Checksum using org.apache.hadoop.util.PureJavaCrc32
> 2014-08-27 15:57:18,970 INFO  [main] util.ChecksumType: Checksum can use org.apache.hadoop.util.PureJavaCrc32C
> 2014-08-27 15:57:19,127 INFO  [main] zlib.ZlibFactory: Successfully loaded & initialized native-zlib library
> 2014-08-27 15:57:19,146 INFO  [main] compress.CodecPool: Got brand-new compressor [.gz]
> 2014-08-27 15:57:19,149 INFO  [main] compress.CodecPool: Got brand-new compressor [.gz]
> 2014-08-27 15:57:19,153 ERROR [main] hbase.KeyValue: Unexpected getShortMidpointKey result, fakeKey:testkey, firstKeyInBlock:testkey
> 2014-08-27 15:57:19,401 INFO  [main] compress.CodecPool: Got brand-new decompressor [.gz]
> SUCCESS
>
>
> 2 questions:
> 1) Is it OK to get "SUCCESS" together with "ERROR [main] hbase.KeyValue: Unexpected getShortMidpointKey result, fakeKey:testkey, firstKeyInBlock:testkey"?
> 2) Is this extra setting of "JAVA_LIBRARY_PATH" a good way to set up snappy with Hadoop 2.4.1 and HBase 0.98.4?
>
>
> Regards
> Arthur
>
>
>
> On 27 Aug, 2014, at 1:13 pm, Arthur.hk.chan@gmail.com <arthur.hk.chan@gmail.com> wrote:
>
> > Hi,
> >
> > Thanks! I tried, but I still get the same error:
> >
> > rm -Rf hadoop-2.4.1-src                                        // delete all old src files and try again
> > tar -vxf hadoop-2.4.1-src.tar.gz
> > cd hadoop-2.4.1-src
> > mvn -DskipTests clean install -Drequire.snappy=true -Pnative   // compile with snappy
> > [INFO]
> > [INFO] Apache Hadoop Main ................................ SUCCESS [0.887s]
> > [INFO] Apache Hadoop Project POM ......................... SUCCESS [0.306s]
> > [INFO] Apache Hadoop Annotations ......................... SUCCESS [0.859s]
> > [INFO] Apache Hadoop Project Dist POM .................... SUCCESS [0.231s]
> > [INFO] Apache Hadoop Assemblies .......................... SUCCESS [0.071s]
> > [INFO] Apache Hadoop Maven Plugins ....................... SUCCESS [0.960s]
> > [INFO] Apache Hadoop MiniKDC ............................. SUCCESS [0.711s]
> > [INFO] Apache Hadoop Auth ................................ SUCCESS [0.641s]
> > [INFO] Apache Hadoop Auth Examples ....................... SUCCESS [0.528s]
> > [INFO] Apache Hadoop Common .............................. SUCCESS [7.859s]
> > [INFO] Apache Hadoop NFS ................................. SUCCESS [0.282s]
> > [INFO] Apache Hadoop Common Project ...................... SUCCESS [0.013s]
> > [INFO] Apache Hadoop HDFS ................................ SUCCESS [14.210s]
> > [INFO] Apache Hadoop HttpFS .............................. SUCCESS [1.322s]
> > [INFO] Apache Hadoop HDFS BookKeeper Journal ............. SUCCESS [0.418s]
> > [INFO] Apache Hadoop HDFS-NFS ............................ SUCCESS [0.178s]
> > [INFO] Apache Hadoop HDFS Project ........................ SUCCESS [0.016s]
> > [INFO] hadoop-yarn ....................................... SUCCESS [0.014s]
> > [INFO] hadoop-yarn-api ................................... SUCCESS [3.012s]
> > [INFO] hadoop-yarn-common ................................ SUCCESS [1.173s]
> > [INFO] hadoop-yarn-server ................................ SUCCESS [0.029s]
> > [INFO] hadoop-yarn-server-common ......................... SUCCESS [0.379s]
> > [INFO] hadoop-yarn-server-nodemanager .................... SUCCESS [0.612s]
> > [INFO] hadoop-yarn-server-web-proxy ...................... SUCCESS [0.166s]
> > [INFO] hadoop-yarn-server-applicationhistoryservice ...... SUCCESS [0.213s]
> > [INFO] hadoop-yarn-server-resourcemanager ................ SUCCESS [0.970s]
> > [INFO] hadoop-yarn-server-tests .......................... SUCCESS [0.158s]
> > [INFO] hadoop-yarn-client ................................ SUCCESS [0.227s]
> > [INFO] hadoop-yarn-applications .......................... SUCCESS [0.013s]
> > [INFO] hadoop-yarn-applications-distributedshell ......... SUCCESS [0.157s]
> > [INFO] hadoop-yarn-applications-unmanaged-am-launcher .... SUCCESS [0.094s]
> > [INFO] hadoop-yarn-site .................................. SUCCESS [0.024s]
> > [INFO] hadoop-yarn-project ............................... SUCCESS [0.030s]
> > [INFO] hadoop-mapreduce-client ........................... SUCCESS [0.027s]
> > [INFO] hadoop-mapreduce-client-core ...................... SUCCESS [1.206s]
> > [INFO] hadoop-mapreduce-client-common .................... SUCCESS [1.140s]
> > [INFO] hadoop-mapreduce-client-shuffle ................... SUCCESS [0.128s]
> > [INFO] hadoop-mapreduce-client-app ....................... SUCCESS [0.634s]
> > [INFO] hadoop-mapreduce-client-hs ........................ SUCCESS [0.557s]
> > [INFO] hadoop-mapreduce-client-jobclient ................. SUCCESS [0.882s]
> > [INFO] hadoop-mapreduce-client-hs-plugins ................ SUCCESS [0.085s]
> > [INFO] Apache Hadoop MapReduce Examples .................. SUCCESS [0.224s]
> > [INFO] hadoop-mapreduce .................................. SUCCESS [0.030s]
> > [INFO] Apache Hadoop MapReduce Streaming ................. SUCCESS [0.200s]
> > [INFO] Apache Hadoop Distributed Copy .................... SUCCESS [0.656s]
> > [INFO] Apache Hadoop Archives ............................ SUCCESS [0.112s]
> > [INFO] Apache Hadoop Rumen ............................... SUCCESS [0.246s]
> > [INFO] Apache Hadoop Gridmix ............................. SUCCESS [0.283s]
> > [INFO] Apache Hadoop Data Join ........................... SUCCESS [0.111s]
> > [INFO] Apache Hadoop Extras .............................. SUCCESS [0.146s]
> > [INFO] Apache Hadoop Pipes ............................... SUCCESS [0.011s]
> > [INFO] Apache Hadoop OpenStack support ................... SUCCESS [0.283s]
> > [INFO] Apache Hadoop Client .............................. SUCCESS [0.106s]
> > [INFO] Apache Hadoop Mini-Cluster ........................ SUCCESS [0.038s]
> > [INFO] Apache Hadoop Scheduler Load Simulator ............ SUCCESS [0.223s]
> > [INFO] Apache Hadoop Tools Dist .......................... SUCCESS [0.106s]
> > [INFO] Apache Hadoop Tools ............................... SUCCESS [0.010s]
> > [INFO] Apache Hadoop Distribution ........................ SUCCESS [0.034s]
> > [INFO] ------------------------------------------------------------------------
> > [INFO] BUILD SUCCESS
> > [INFO] ------------------------------------------------------------------------
> > [INFO] Total time: 45.478s
> > [INFO] Finished at: Wed Aug 27 12:10:06 HKT 2014
> > [INFO] Final Memory: 107M/1898M
> > [INFO] ------------------------------------------------------------------------
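
(Worth noting: the install pass drops the freshly built native bits under
hadoop-common's target tree, and the package pass below also bundles them
into the dist tarball; on a stock 2.4.1 source tree that means:)

ls hadoop-common-project/hadoop-common/target/native/target/usr/local/lib/
ls hadoop-dist/target/        # hadoop-2.4.1.tar.gz appears here once -Dtar is used
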
> > mvn package -Pdist,native -DskipTests -Dtar -Drequire.snappy=true   // package it with snappy
> > [INFO] ------------------------------------------------------------------------
> > [INFO] Reactor Summary:
> > [INFO]
> > [INFO] Apache Hadoop Main ................................ SUCCESS [0.727s]
> > [INFO] Apache Hadoop Project POM ......................... SUCCESS [0.555s]
> > [INFO] Apache Hadoop Annotations ......................... SUCCESS [1.011s]
> > [INFO] Apache Hadoop Assemblies .......................... SUCCESS [0.128s]
> > [INFO] Apache Hadoop Project Dist POM .................... SUCCESS [1.342s]
> > [INFO] Apache Hadoop Maven Plugins ....................... SUCCESS [1.251s]
> > [INFO] Apache Hadoop MiniKDC ............................. SUCCESS [1.007s]
> > [INFO] Apache Hadoop Auth ................................ SUCCESS [1.252s]
> > [INFO] Apache Hadoop Auth Examples ....................... SUCCESS [0.929s]
> > [INFO] Apache Hadoop Common .............................. SUCCESS [41.330s]
> > [INFO] Apache Hadoop NFS ................................. SUCCESS [1.986s]
> > [INFO] Apache Hadoop Common Project ...................... SUCCESS [0.015s]
> > [INFO] Apache Hadoop HDFS ................................ SUCCESS [1:08.367s]
> > [INFO] Apache Hadoop HttpFS .............................. SUCCESS [47.198s]
> > [INFO] Apache Hadoop HDFS BookKeeper Journal ............. SUCCESS [2.807s]
> > [INFO] Apache Hadoop HDFS-NFS ............................ SUCCESS [1.350s]
> > [INFO] Apache Hadoop HDFS Project ........................ SUCCESS [0.027s]
> > [INFO] hadoop-yarn ....................................... SUCCESS [0.013s]
> > [INFO] hadoop-yarn-api ................................... SUCCESS [36.848s]
> > [INFO] hadoop-yarn-common ................................ SUCCESS [12.502s]
> > [INFO] hadoop-yarn-server ................................ SUCCESS [0.032s]
> > [INFO] hadoop-yarn-server-common ......................... SUCCESS [3.688s]
> > [INFO] hadoop-yarn-server-nodemanager .................... SUCCESS [8.207s]
> > [INFO] hadoop-yarn-server-web-proxy ...................... SUCCESS [1.048s]
> > [INFO] hadoop-yarn-server-applicationhistoryservice ...... SUCCESS [1.839s]
> > [INFO] hadoop-yarn-server-resourcemanager ................ SUCCESS [4.766s]
> > [INFO] hadoop-yarn-server-tests .......................... SUCCESS [0.247s]
> > [INFO] hadoop-yarn-client ................................ SUCCESS [1.735s]
> > [INFO] hadoop-yarn-applications .......................... SUCCESS [0.013s]
> > [INFO] hadoop-yarn-applications-distributedshell ......... SUCCESS [0.984s]
> > [INFO] hadoop-yarn-applications-unmanaged-am-launcher .... SUCCESS [0.792s]
> > [INFO] hadoop-yarn-site .................................. SUCCESS [0.034s]
> > [INFO] hadoop-yarn-project ............................... SUCCESS [3.327s]
> > [INFO] hadoop-mapreduce-client ........................... SUCCESS [0.090s]
> > [INFO] hadoop-mapreduce-client-core ...................... SUCCESS [7.451s]
> > [INFO] hadoop-mapreduce-client-common .................... SUCCESS [7.081s]
> > [INFO] hadoop-mapreduce-client-shuffle ................... SUCCESS [0.972s]
> > [INFO] hadoop-mapreduce-client-app ....................... SUCCESS [3.085s]
> > [INFO] hadoop-mapreduce-client-hs ........................ SUCCESS [3.119s]
> > [INFO] hadoop-mapreduce-client-jobclient ................. SUCCESS [1.934s]
> > [INFO] hadoop-mapreduce-client-hs-plugins ................ SUCCESS [0.772s]
> > [INFO] Apache Hadoop MapReduce Examples .................. SUCCESS [2.162s]
> > [INFO] hadoop-mapreduce .................................. SUCCESS [2.622s]
> > [INFO] Apache Hadoop MapReduce Streaming ................. SUCCESS [1.744s]
> > [INFO] Apache Hadoop Distributed Copy .................... SUCCESS [4.466s]
> > [INFO] Apache Hadoop Archives ............................ SUCCESS [0.956s]
> > [INFO] Apache Hadoop Rumen ............................... SUCCESS [2.203s]
> > [INFO] Apache Hadoop Gridmix ............................. SUCCESS [1.509s]
> > [INFO] Apache Hadoop Data Join ........................... SUCCESS [0.909s]
> > [INFO] Apache Hadoop Extras .............................. SUCCESS [1.103s]
> > [INFO] Apache Hadoop Pipes ............................... SUCCESS [4.794s]
> > [INFO] Apache Hadoop OpenStack support ................... SUCCESS [2.111s]
> > [INFO] Apache Hadoop Client .............................. SUCCESS [3.919s]
> > [INFO] Apache Hadoop Mini-Cluster ........................ SUCCESS [0.044s]
> > [INFO] Apache Hadoop Scheduler Load Simulator ............ SUCCESS [1.665s]
> > [INFO] Apache Hadoop Tools Dist .......................... SUCCESS [3.936s]
> > [INFO] Apache Hadoop Tools ............................... SUCCESS [0.042s]
> > [INFO] Apache Hadoop Distribution ........................ SUCCESS [15.208s]
> > [INFO] ------------------------------------------------------------------------
> > [INFO] BUILD SUCCESS
> > [INFO] ------------------------------------------------------------------------
> > [INFO] Total time: 5:22.529s
> > [INFO] Finished at: Wed Aug 27 12:17:06 HKT 2014
> > [INFO] Final Memory: 86M/755M
> > [INFO] ------------------------------------------------------------------------
> >
> > ll hadoop-common-project/hadoop-common/target/native/target/usr/local/lib/
> > -rw-rw-r--. 1 hduser hadoop 1062640 Aug 27 12:12 libhadoop.a
> > lrwxrwxrwx. 1 hduser hadoop      18 Aug 27 12:12 libhadoop.so -> libhadoop.so.1.0.0
> > -rwxrwxr-x. 1 hduser hadoop  630328 Aug 27 12:12 libhadoop.so.1.0.0
> >
> > (copy them to the $HADOOP_HOME and $HBASE_HOME native library directories)
> > cp hadoop-common-project/hadoop-common/target/native/target/usr/local/lib/* $HADOOP_HOME/lib/native/Linux-amd64-64/
> > cp hadoop-common-project/hadoop-common/target/native/target/usr/local/lib/* $HBASE_HOME/lib/native/Linux-amd64-64/
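
(Before restarting anything, a quick sanity check that the freshly copied
libhadoop.so really was built with snappy support -- nm comes with
binutils, and a snappy-enabled build should export the SnappyCompressor/
SnappyDecompressor JNI symbols:)

nm -D $HBASE_HOME/lib/native/Linux-amd64-64/libhadoop.so | grep -i snappy
# no output here would mean the build silently skipped snappy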
> >
> > ll $HADOOP_HOME/lib/native/Linux-amd64-64/
> > total 21236
> > -rw-rw-r--. 1 hduser hadoop 1062640 Aug 27 12:19 libhadoop.a                                // new
> > lrwxrwxrwx. 1 hduser hadoop      24 Aug 27 06:54 libhadoopsnappy.so -> libhadoopsnappy.so.0.0.1
> > lrwxrwxrwx. 1 hduser hadoop      24 Aug 27 06:54 libhadoopsnappy.so.0 -> libhadoopsnappy.so.0.0.1
> > -rwxr-xr-x. 1 hduser hadoop   54961 Aug 27 06:54 libhadoopsnappy.so.0.0.1
> > -rwxrwxr-x. 1 hduser hadoop  630328 Aug 27 12:19 libhadoop.so                               // new
> > -rwxrwxr-x. 1 hduser hadoop  630328 Aug 27 12:19 libhadoop.so.1.0.0                         // new
> > lrwxrwxrwx. 1 hduser hadoop      55 Aug 27 06:54 libjvm.so -> /usr/lib/jvm/jdk1.6.0_45/jre/lib/amd64/server/libjvm.so
> > lrwxrwxrwx. 1 hduser hadoop      25 Aug 27 06:54 libprotobuf-lite.so -> libprotobuf-lite.so.8.0.0
> > lrwxrwxrwx. 1 hduser hadoop      25 Aug 27 06:54 libprotobuf-lite.so.8 -> libprotobuf-lite.so.8.0.0
> > -rwxr-xr-x. 1 hduser hadoop  964689 Aug 27 06:54 libprotobuf-lite.so.8.0.0
> > lrwxrwxrwx. 1 hduser hadoop      20 Aug 27 06:54 libprotobuf.so -> libprotobuf.so.8.0.0
> > lrwxrwxrwx. 1 hduser hadoop      20 Aug 27 06:54 libprotobuf.so.8 -> libprotobuf.so.8.0.0
> > -rwxr-xr-x. 1 hduser hadoop 8300050 Aug 27 06:54 libprotobuf.so.8.0.0
> > lrwxrwxrwx. 1 hduser hadoop      18 Aug 27 06:54 libprotoc.so -> libprotoc.so.8.0.0
> > lrwxrwxrwx. 1 hduser hadoop      18 Aug 27 06:54 libprotoc.so.8 -> libprotoc.so.8.0.0
> > -rwxr-xr-x. 1 hduser hadoop 9935810 Aug 27 06:54 libprotoc.so.8.0.0
> > lrwxrwxrwx. 1 hduser hadoop      23 Aug 27 11:31 libsnappy.so -> /usr/lib64/libsnappy.so
> > lrwxrwxrwx. 1 hduser hadoop      23 Aug 27 11:32 libsnappy.so.1 -> /usr/lib64/libsnappy.so
> > -rwxr-xr-x. 1 hduser hadoop  147726 Aug 27 06:54 libsnappy.so.1.2.0
> > drwxr-xr-x. 2 hduser hadoop    4096 Aug 27 11:15 pkgconfig
> >
> >
> > ll $HBASE_HOME/lib/native/Linux-amd64-64/
> > -rw-rw-r--. 1 hduser hadoop 1062640 Aug 27 12:19 libhadoop.a                                // new
> > -rw-rw-r--. 1 hduser hadoop 1487564 Aug 27 11:14 libhadooppipes.a
> > lrwxrwxrwx. 1 hduser hadoop      24 Aug 27 07:08 libhadoopsnappy.so -> libhadoopsnappy.so.0.0.1
> > lrwxrwxrwx. 1 hduser hadoop      24 Aug 27 07:08 libhadoopsnappy.so.0 -> libhadoopsnappy.so.0.0.1
> > -rwxr-xr-x. 1 hduser hadoop   54961 Aug 27 07:08 libhadoopsnappy.so.0.0.1
> > -rwxrwxr-x. 1 hduser hadoop  630328 Aug 27 12:19 libhadoop.so                               // new
> > -rwxrwxr-x. 1 hduser hadoop  630328 Aug 27 12:19 libhadoop.so.1.0.0                         // new
> > -rw-rw-r--. 1 hduser hadoop  582472 Aug 27 11:14 libhadooputils.a
> > -rw-rw-r--. 1 hduser hadoop  298626 Aug 27 11:14 libhdfs.a
> > -rwxrwxr-x. 1 hduser hadoop  200370 Aug 27 11:14 libhdfs.so
> > -rwxrwxr-x. 1 hduser hadoop  200370 Aug 27 11:14 libhdfs.so.0.0.0
> > lrwxrwxrwx. 1 hduser hadoop      55 Aug 27 07:08 libjvm.so -> /usr/lib/jvm/jdk1.6.0_45/jre/lib/amd64/server/libjvm.so
> > lrwxrwxrwx. 1 hduser hadoop      25 Aug 27 07:08 libprotobuf-lite.so -> libprotobuf-lite.so.8.0.0
> > lrwxrwxrwx. 1 hduser hadoop      25 Aug 27 07:08 libprotobuf-lite.so.8 -> libprotobuf-lite.so.8.0.0
> > -rwxr-xr-x. 1 hduser hadoop  964689 Aug 27 07:08 libprotobuf-lite.so.8.0.0
> > lrwxrwxrwx. 1 hduser hadoop      20 Aug 27 07:08 libprotobuf.so -> libprotobuf.so.8.0.0
> > lrwxrwxrwx. 1 hduser hadoop      20 Aug 27 07:08 libprotobuf.so.8 -> libprotobuf.so.8.0.0
> > -rwxr-xr-x. 1 hduser hadoop 8300050 Aug 27 07:08 libprotobuf.so.8.0.0
> > lrwxrwxrwx. 1 hduser hadoop      18 Aug 27 07:08 libprotoc.so -> libprotoc.so.8.0.0
> > lrwxrwxrwx. 1 hduser hadoop      18 Aug 27 07:08 libprotoc.so.8 -> libprotoc.so.8.0.0
> > -rwxr-xr-x. 1 hduser hadoop 9935810 Aug 27 07:08 libprotoc.so.8.0.0
> > lrwxrwxrwx. 1 hduser hadoop      23 Aug 27 11:32 libsnappy.so -> /usr/lib64/libsnappy.so
> > lrwxrwxrwx. 1 hduser hadoop      23 Aug 27 11:33 libsnappy.so.1 -> /usr/lib64/libsnappy.so
> > -rwxr-xr-x. 1 hduser hadoop  147726 Aug 27 07:08 libsnappy.so.1.2.0
> > drwxr-xr-x. 2 hduser hadoop    4096 Aug 27 07:08 pkgconfig
> >
> >
> >
> > sudo yum install snappy snappy-devel
> > Loaded plugins: fastestmirror, security
> > Loading mirror speeds from cached hostfile
> >  ...
> > Package snappy-1.1.0-1.el6.x86_64 already installed and latest version
> > Package snappy-devel-1.1.0-1.el6.x86_64 already installed and latest version
> > Nothing to do
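
(rpm can show where those packages put the library before symlinking; on
this box the listing above suggests /usr/lib64/libsnappy.so.1.2.0 plus the
libsnappy.so dev symlink:)

rpm -ql snappy snappy-devel | grep libsnappy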
> >
> >
> > ln -sf /usr/lib64/libsnappy.so $HADOOP_HOME/lib/native/Linux-amd64-64/.
> > ln -sf /usr/lib64/libsnappy.so $HBASE_HOME/lib/native/Linux-amd64-64/.
> >
> > ll $HADOOP_HOME/lib/native/Linux-amd64-64/libsnappy.so
> > lrwxrwxrwx. 1 hduser hadoop 23 Aug 27 11:31 $HADOOP_HOME/lib/native/Linux-amd64-64/libsnappy.so -> /usr/lib64/libsnappy.so
> > ll $HBASE_HOME/lib/native/Linux-amd64-64/libsnappy.so
> > lrwxrwxrwx. 1 hduser hadoop 23 Aug 27 11:32 $HBASE_HOME/lib/native/Linux-amd64-64/libsnappy.so -> /usr/lib64/libsnappy.so
> >
> >
> >
> > ($HADOOP_HOME/etc/hadoop/hadoop-env.sh: added the following)
> > ### 2014-08-27
> > export LD_LIBRARY_PATH=$LD_LIBRARY_PATH:$HADOOP_HOME/lib/native/Linux-amd64-64/:/usr/local/lib/
> > ###
> >
> > ($HBASE_HOME/conf/hbase-env.sh: added the following)
> > ### 2014-08-27
> > export LD_LIBRARY_PATH=$LD_LIBRARY_PATH:$HADOOP_HOME/lib/native/Linux-amd64-64/:/usr/local/lib/
> > export HBASE_LIBRARY_PATH=$HBASE_LIBRARY_PATH:$HBASE_HOME/lib/native/Linux-amd64-64/:/usr/local/lib/:$HBASE_HOME/lib/hadoop-snappy-0.0.1-SNAPSHOT.jar
> > export CLASSPATH=$CLASSPATH:$HBASE_LIBRARY_PATH
> > export HBASE_CLASSPATH=$HBASE_CLASSPATH:$HBASE_LIBRARY_PATH
> > ###
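
(An alternative sketch, not tested here: instead of juggling
LD_LIBRARY_PATH, hand the directory straight to the JVM via HBASE_OPTS in
hbase-env.sh:)

export HBASE_OPTS="$HBASE_OPTS -Djava.library.path=$HBASE_HOME/lib/native/Linux-amd64-64"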
> >
> >
> > (restarted both Hadoop and HBase)
> > jps
> > 26324 HRegionServer
> > 26137 HMaster
> > 25567 JobHistoryServer
> > 25485 NodeManager
> > 25913 WebAppProxyServer
> > 24831 DataNode
> > 24712 NameNode
> > 27146 Jps
> > 9219 QuorumPeerMain
> > 25042 JournalNode
> > 25239 DFSZKFailoverController
> > 25358 ResourceManager
> >
> >
> > bin/hbase org.apache.hadoop.hbase.util.CompressionTest file:///tmp/snappy-test snappy
> > 2014-08-27 12:24:08,030 INFO  [main] Configuration.deprecation: hadoop.native.lib is deprecated. Instead, use io.native.lib.available
> > SLF4J: Class path contains multiple SLF4J bindings.
> > SLF4J: Found binding in [jar:file:/mnt/hadoop/hbase-0.98.4-hadoop2/lib/slf4j-log4j12-1.6.4.jar!/org/slf4j/impl/StaticLoggerBinder.class]
> > SLF4J: Found binding in [jar:file:/mnt/hadoop/hadoop-2.4.1/share/hadoop/common/lib/slf4j-log4j12-1.7.5.jar!/org/slf4j/impl/StaticLoggerBinder.class]
> > SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
> > 2014-08-27 12:24:08,387 INFO  [main] util.ChecksumType: Checksum using org.apache.hadoop.util.PureJavaCrc32
> > 2014-08-27 12:24:08,388 INFO  [main] util.ChecksumType: Checksum can use org.apache.hadoop.util.PureJavaCrc32C
> > Exception in thread "main" java.lang.RuntimeException: native snappy library not available: this version of libhadoop was built without snappy support.
> >       at org.apache.hadoop.io.compress.SnappyCodec.checkNativeCodeLoaded(SnappyCodec.java:64)
> >       at org.apache.hadoop.io.compress.SnappyCodec.getCompressorType(SnappyCodec.java:132)
> >       at org.apache.hadoop.io.compress.CodecPool.getCompressor(CodecPool.java:148)
> >       at org.apache.hadoop.io.compress.CodecPool.getCompressor(CodecPool.java:163)
> >       at org.apache.hadoop.hbase.io.compress.Compression$Algorithm.getCompressor(Compression.java:310)
> >       at org.apache.hadoop.hbase.io.encoding.HFileBlockDefaultEncodingContext.<init>(HFileBlockDefaultEncodingContext.java:92)
> >       at org.apache.hadoop.hbase.io.hfile.HFileBlock$Writer.<init>(HFileBlock.java:690)
> >       at org.apache.hadoop.hbase.io.hfile.HFileWriterV2.finishInit(HFileWriterV2.java:117)
> >       at org.apache.hadoop.hbase.io.hfile.HFileWriterV2.<init>(HFileWriterV2.java:109)
> >       at org.apache.hadoop.hbase.io.hfile.HFileWriterV2$WriterFactoryV2.createWriter(HFileWriterV2.java:97)
> >       at org.apache.hadoop.hbase.io.hfile.HFile$WriterFactory.create(HFile.java:393)
> >       at org.apache.hadoop.hbase.util.CompressionTest.doSmokeTest(CompressionTest.java:118)
> >       at org.apache.hadoop.hbase.util.CompressionTest.main(CompressionTest.java:148)
> >
> >
> > bin/hbase org.apache.hadoop.hbase.util.CompressionTest file:///tmp/snappy-test gz
> > 2014-08-27 12:35:34,485 INFO  [main] Configuration.deprecation: hadoop.native.lib is deprecated. Instead, use io.native.lib.available
> > SLF4J: Class path contains multiple SLF4J bindings.
> > SLF4J: Found binding in [jar:file:/mnt/hadoop/hbase-0.98.4-hadoop2/lib/slf4j-log4j12-1.6.4.jar!/org/slf4j/impl/StaticLoggerBinder.class]
> > SLF4J: Found binding in [jar:file:/mnt/hadoop/hadoop-2.4.1/share/hadoop/common/lib/slf4j-log4j12-1.7.5.jar!/org/slf4j/impl/StaticLoggerBinder.class]
> > SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
> > 2014-08-27 12:35:35,495 INFO  [main] util.ChecksumType: Checksum using org.apache.hadoop.util.PureJavaCrc32
> > 2014-08-27 12:35:35,495 INFO  [main] util.ChecksumType: Checksum can use org.apache.hadoop.util.PureJavaCrc32C
> > 2014-08-27 12:35:35,822 INFO  [main] zlib.ZlibFactory: Successfully loaded & initialized native-zlib library
> > 2014-08-27 12:35:35,851 INFO  [main] compress.CodecPool: Got brand-new compressor [.gz]
> > 2014-08-27 12:35:35,855 INFO  [main] compress.CodecPool: Got brand-new compressor [.gz]
> > 2014-08-27 12:35:35,866 ERROR [main] hbase.KeyValue: Unexpected getShortMidpointKey result, fakeKey:testkey, firstKeyInBlock:testkey
> > 2014-08-27 12:35:36,636 INFO  [main] compress.CodecPool: Got brand-new decompressor [.gz]
> > SUCCESS
> >
> >
> >
> >
> >
> > So I still get the same issue. I feel the problem should come from the Hadoop compilation, but I have no idea where it would be wrong. Please help.
> >
> >
> > In my /etc/hadoop/core-site.xml, I have the following related to snappy:
> >    <property>
> >     <name>io.compression.codecs</name>
> >     <value>
> >       org.apache.hadoop.io.compress.GzipCodec,
> >       org.apache.hadoop.io.compress.DefaultCodec,
> >       org.apache.hadoop.io.compress.BZip2Codec,
> >       org.apache.hadoop.io.compress.SnappyCodec
> >     </value>
> >    </property>
> >
> > In my mapred-site.xml, I have the following related to snappy:
> >    <property>
> >     <name>mapred.output.compress</name>
> >     <value>false</value>
> >     <description>Should the job outputs be compressed?</description>
> >    </property>
> >    <property>
> >     <name>mapred.output.compression.type</name>
> >     <value>RECORD</value>
> >     <description>If the job outputs are to be compressed as SequenceFiles, how should they be compressed? Should be one of NONE, RECORD or BLOCK.</description>
> >    </property>
> >    <property>
> >     <name>mapred.output.compression.codec</name>
> >     <value>org.apache.hadoop.io.compress.SnappyCodec</value>
> >     <description>If the job outputs are compressed, how should they be compressed?</description>
> >    </property>
> >    <property>
> >     <name>mapred.compress.map.output</name>
> >     <value>true</value>
> >     <description>Should the outputs of the maps be compressed before being sent across the network. Uses SequenceFile compression.</description>
> >    </property>
> >    <property>
> >     <name>mapred.map.output.compression.codec</name>
> >     <value>org.apache.hadoop.io.compress.SnappyCodec</value>
> >     <description>If the map outputs are compressed, how should they be compressed?</description>
> >   </property>
> >
> >   <property>
> >    <name>mapreduce.map.output.compress</name>
> >    <value>true</value>
> >   </property>
> >   <property>
> >    <name>mapreduce.map.output.compress.codec</name>
> >    <value>org.apache.hadoop.io.compress.SnappyCodec</value>
> >   </property>
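
(The first block above uses the old mapred.* names, which Hadoop 2.x still
accepts but logs as deprecated; the mapreduce.* equivalents would look
like this, for example:)

   <property>
    <name>mapreduce.output.fileoutputformat.compress.codec</name>
    <value>org.apache.hadoop.io.compress.SnappyCodec</value>
   </property>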
> >
> >
> > I didn't add any snappy-related property to hbase-site.xml
> >
> >
> >
> > Regards
> > Arthur
> >
> >
> >
> >
> > On 27 Aug, 2014, at 8:07 am, Andrew Purtell <apurtell@apache.org> wrote:
> >
> >> On Tue, Aug 26, 2014 at 4:25 PM, Arthur.hk.chan@gmail.com <arthur.hk.chan@gmail.com> wrote:
> >>
> >>> Exception in thread "main" java.lang.RuntimeException: native snappy library not available: this version of libhadoop was built without snappy support.
> >>
> >> You are almost there. Unfortunately the native Hadoop libraries you
> >> copied into HBase's lib/native/Linux-amd64-64/ directory were apparently
> >> built without snappy support, as the exception indicates. You'll need to
> >> compile the native Hadoop libraries with snappy support enabled. Install
> >> snappy-devel as Alex mentioned and then build the Hadoop native libraries.
> >>
> >> 1. Get Hadoop sources for the Hadoop version
> >> 2. tar xvzf ....
> >> 3. cd /path/to/hadoop/src
> >> 4. mvn -DskipTests clean install -Drequire.snappy=true -Pnative
> >> 5. cp hadoop-common-project/hadoop-common/target/native/target/usr/local/lib/libhadoop.* /path/to/hbase/lib/native/Linux-amd64-64
> >>
> >> (The -Drequire.snappy=true will fail the build if Snappy link libraries
> >> are not installed, so you can be sure of this.)
> >>
> >>
> >> --
> >> Best regards,
> >>
> >>   - Andy
> >>
> >> Problems worthy of attack prove their worth by hitting back. - Piet Hein
> >> (via Tom White)
> >
>
>
