From: Jean-Marc Spaggiari
Date: Tue, 26 Aug 2014 19:45:01 -0400
Subject: Re: Compilation error: HBASE 0.98.4 with Snappy
To: user@hbase.apache.org

Hi Arthur,

What does uname -m give you? You need to check that to create the right folder under the lib directory.

JM

2014-08-26 19:43 GMT-04:00 Alex Kamil:

> Something like this worked for me:
> 1. get the hbase binaries
> 2. sudo yum install snappy snappy-devel
> 3. ln -sf /usr/lib64/libsnappy.so /var/lib/hadoop/lib/native/Linux-amd64-64/.
> 4. ln -sf /usr/lib64/libsnappy.so /var/lib/hbase/lib/native/Linux-amd64-64/.
> 5. add the snappy jar under $HADOOP_HOME/lib and $HBASE_HOME/lib
> ref: https://issues.apache.org/jira/browse/PHOENIX-877
>
> On Tue, Aug 26, 2014 at 7:25 PM, Arthur.hk.chan@gmail.com <arthur.hk.chan@gmail.com> wrote:
>
>> Hi,
>>
>> I just tried three more steps but was not able to get through.
>>
>> 1) Copied the snappy files to $HBASE_HOME/lib:
>> $ cd $HBASE_HOME
>> $ ll lib/*sna*
>> -rw-r--r--. 1 hduser hadoop  11526 Aug 27 06:54 lib/hadoop-snappy-0.0.1-SNAPSHOT.jar
>> -rw-rw-r--. 1 hduser hadoop 995968 Aug  3 18:43 lib/snappy-java-1.0.4.1.jar
>>
>> $ ll lib/native/
>> drwxrwxr-x. 4 hduser hadoop 4096 Aug 27 06:54 Linux-amd64-64
>>
>> $ ll lib/native/Linux-amd64-64/
>> total 18964
>> lrwxrwxrwx. 1 hduser Hadoop      24 Aug 27 07:08 libhadoopsnappy.so -> libhadoopsnappy.so.0.0.1
>> lrwxrwxrwx. 1 hduser Hadoop      24 Aug 27 07:08 libhadoopsnappy.so.0 -> libhadoopsnappy.so.0.0.1
>> -rwxr-xr-x. 1 hduser Hadoop   54961 Aug 27 07:08 libhadoopsnappy.so.0.0.1
>> lrwxrwxrwx. 1 hduser Hadoop      55 Aug 27 07:08 libjvm.so -> /usr/lib/jvm/jdk1.6.0_45/jre/lib/amd64/server/libjvm.so
>> lrwxrwxrwx. 1 hduser Hadoop      25 Aug 27 07:08 libprotobuf-lite.so -> libprotobuf-lite.so.8.0.0
>> lrwxrwxrwx. 1 hduser Hadoop      25 Aug 27 07:08 libprotobuf-lite.so.8 -> libprotobuf-lite.so.8.0.0
>> -rwxr-xr-x. 1 hduser Hadoop  964689 Aug 27 07:08 libprotobuf-lite.so.8.0.0
>> lrwxrwxrwx. 1 hduser Hadoop      20 Aug 27 07:08 libprotobuf.so -> libprotobuf.so.8.0.0
>> lrwxrwxrwx. 1 hduser Hadoop      20 Aug 27 07:08 libprotobuf.so.8 -> libprotobuf.so.8.0.0
>> -rwxr-xr-x. 1 hduser Hadoop 8300050 Aug 27 07:08 libprotobuf.so.8.0.0
>> lrwxrwxrwx. 1 hduser Hadoop      18 Aug 27 07:08 libprotoc.so -> libprotoc.so.8.0.0
>> lrwxrwxrwx. 1 hduser Hadoop      18 Aug 27 07:08 libprotoc.so.8 -> libprotoc.so.8.0.0
>> -rwxr-xr-x. 1 hduser Hadoop 9935810 Aug 27 07:08 libprotoc.so.8.0.0
>> lrwxrwxrwx. 1 hduser Hadoop      18 Aug 27 07:08 libsnappy.so -> libsnappy.so.1.2.0
>> lrwxrwxrwx. 1 hduser Hadoop      18 Aug 27 07:08 libsnappy.so.1 -> libsnappy.so.1.2.0
>> -rwxr-xr-x. 1 hduser Hadoop  147726 Aug 27 07:08 libsnappy.so.1.2.0
>> drwxr-xr-x. 2 hduser Hadoop    4096 Aug 27 07:08 pkgconfig
>>
>> 2) In $HBASE_HOME/conf/hbase-env.sh, added:
>>
>> ###
>> export LD_LIBRARY_PATH=$LD_LIBRARY_PATH:$HADOOP_HOME/lib/native/Linux-amd64-64/:/usr/local/lib/
>> export HBASE_LIBRARY_PATH=$HBASE_LIBRARY_PATH:$HBASE_HOME/lib/native/Linux-amd64-64/:/usr/local/lib/:$HBASE_HOME/lib/hadoop-snappy-0.0.1-SNAPSHOT.jar
>> export CLASSPATH=$CLASSPATH:$HBASE_LIBRARY_PATH
>> export HBASE_CLASSPATH=$HBASE_CLASSPATH:$HBASE_LIBRARY_PATH
>> ###
>>
>> 3) Restarted HBase and tried again:
>> $ bin/hbase org.apache.hadoop.hbase.util.CompressionTest file:///tmp/snappy-test snappy
>> 2014-08-27 07:16:09,490 INFO [main] Configuration.deprecation: hadoop.native.lib is deprecated. Instead, use io.native.lib.available
>> SLF4J: Class path contains multiple SLF4J bindings.
>> SLF4J: Found binding in [jar:file:/edh/hadoop/hbase-0.98.4-hadoop2/lib/slf4j-log4j12-1.6.4.jar!/org/slf4j/impl/StaticLoggerBinder.class]
>> SLF4J: Found binding in [jar:file:/edh/hadoop/hadoop-2.4.1/share/hadoop/common/lib/slf4j-log4j12-1.7.5.jar!/org/slf4j/impl/StaticLoggerBinder.class]
>> SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
>> 2014-08-27 07:16:10,323 INFO [main] util.ChecksumType: Checksum using org.apache.hadoop.util.PureJavaCrc32
>> 2014-08-27 07:16:10,324 INFO [main] util.ChecksumType: Checksum can use org.apache.hadoop.util.PureJavaCrc32C
>> Exception in thread "main" java.lang.RuntimeException: native snappy library not available: this version of libhadoop was built without snappy support.
>>     at org.apache.hadoop.io.compress.SnappyCodec.checkNativeCodeLoaded(SnappyCodec.java:64)
>>     at org.apache.hadoop.io.compress.SnappyCodec.getCompressorType(SnappyCodec.java:132)
>>     at org.apache.hadoop.io.compress.CodecPool.getCompressor(CodecPool.java:148)
>>     at org.apache.hadoop.io.compress.CodecPool.getCompressor(CodecPool.java:163)
>>     at org.apache.hadoop.hbase.io.compress.Compression$Algorithm.getCompressor(Compression.java:310)
>>     at org.apache.hadoop.hbase.io.encoding.HFileBlockDefaultEncodingContext.<init>(HFileBlockDefaultEncodingContext.java:92)
>>     at org.apache.hadoop.hbase.io.hfile.HFileBlock$Writer.<init>(HFileBlock.java:690)
>>     at org.apache.hadoop.hbase.io.hfile.HFileWriterV2.finishInit(HFileWriterV2.java:117)
>>     at org.apache.hadoop.hbase.io.hfile.HFileWriterV2.<init>(HFileWriterV2.java:109)
>>     at org.apache.hadoop.hbase.io.hfile.HFileWriterV2$WriterFactoryV2.createWriter(HFileWriterV2.java:97)
>>     at org.apache.hadoop.hbase.io.hfile.HFile$WriterFactory.create(HFile.java:393)
>>     at org.apache.hadoop.hbase.util.CompressionTest.doSmokeTest(CompressionTest.java:118)
>>     at org.apache.hadoop.hbase.util.CompressionTest.main(CompressionTest.java:148)
>>
>> Regards
>> Arthur
>>
>> On 27 Aug, 2014, at 6:27 am, Arthur.hk.chan@gmail.com <arthur.hk.chan@gmail.com> wrote:
>>
>>> Hi Sean,
>>>
>>> Thanks for your reply.
>>>
>>> I tried the following tests:
>>>
>>> $ bin/hbase org.apache.hadoop.hbase.util.CompressionTest file:///tmp/snappy-test gz
>>> 2014-08-26 23:06:17,778 INFO [main] Configuration.deprecation: hadoop.native.lib is deprecated. Instead, use io.native.lib.available
>>> SLF4J: Class path contains multiple SLF4J bindings.
>>> SLF4J: Found binding in [jar:file:/edh/hadoop/hbase-0.98.4-hadoop2/lib/slf4j-log4j12-1.6.4.jar!/org/slf4j/impl/StaticLoggerBinder.class]
>>> SLF4J: Found binding in [jar:file:/edh/hadoop/hadoop-2.4.1/share/hadoop/common/lib/slf4j-log4j12-1.7.5.jar!/org/slf4j/impl/StaticLoggerBinder.class]
>>> SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
>>> 2014-08-26 23:06:18,103 INFO [main] util.ChecksumType: Checksum using org.apache.hadoop.util.PureJavaCrc32
>>> 2014-08-26 23:06:18,104 INFO [main] util.ChecksumType: Checksum can use org.apache.hadoop.util.PureJavaCrc32C
>>> 2014-08-26 23:06:18,260 INFO [main] zlib.ZlibFactory: Successfully loaded & initialized native-zlib library
>>> 2014-08-26 23:06:18,276 INFO [main] compress.CodecPool: Got brand-new compressor [.gz]
>>> 2014-08-26 23:06:18,280 INFO [main] compress.CodecPool: Got brand-new compressor [.gz]
>>> 2014-08-26 23:06:18,921 INFO [main] compress.CodecPool: Got brand-new decompressor [.gz]
>>> SUCCESS
>>>
>>> $ bin/hbase org.apache.hadoop.hbase.util.CompressionTest file:///tmp/snappy-test snappy
>>> 2014-08-26 23:07:08,246 INFO [main] Configuration.deprecation: hadoop.native.lib is deprecated. Instead, use io.native.lib.available
>>> SLF4J: Class path contains multiple SLF4J bindings.
>>> SLF4J: Found binding in [jar:file:/edh/hadoop/hbase-0.98.4-hadoop2/lib/slf4j-log4j12-1.6.4.jar!/org/slf4j/impl/StaticLoggerBinder.class]
>>> SLF4J: Found binding in [jar:file:/edh/hadoop/hadoop-2.4.1/share/hadoop/common/lib/slf4j-log4j12-1.7.5.jar!/org/slf4j/impl/StaticLoggerBinder.class]
>>> SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
>>> 2014-08-26 23:07:08,578 INFO [main] util.ChecksumType: Checksum using org.apache.hadoop.util.PureJavaCrc32
>>> 2014-08-26 23:07:08,579 INFO [main] util.ChecksumType: Checksum can use org.apache.hadoop.util.PureJavaCrc32C
>>> Exception in thread "main" java.lang.RuntimeException: native snappy library not available: this version of libhadoop was built without snappy support.
>>>     at org.apache.hadoop.io.compress.SnappyCodec.checkNativeCodeLoaded(SnappyCodec.java:64)
>>>     at org.apache.hadoop.io.compress.SnappyCodec.getCompressorType(SnappyCodec.java:132)
>>>     at org.apache.hadoop.io.compress.CodecPool.getCompressor(CodecPool.java:148)
>>>     at org.apache.hadoop.io.compress.CodecPool.getCompressor(CodecPool.java:163)
>>>     at org.apache.hadoop.hbase.io.compress.Compression$Algorithm.getCompressor(Compression.java:310)
>>>     at org.apache.hadoop.hbase.io.encoding.HFileBlockDefaultEncodingContext.<init>(HFileBlockDefaultEncodingContext.java:92)
>>>     at org.apache.hadoop.hbase.io.hfile.HFileBlock$Writer.<init>(HFileBlock.java:690)
>>>     at org.apache.hadoop.hbase.io.hfile.HFileWriterV2.finishInit(HFileWriterV2.java:117)
>>>     at org.apache.hadoop.hbase.io.hfile.HFileWriterV2.<init>(HFileWriterV2.java:109)
>>>     at org.apache.hadoop.hbase.io.hfile.HFileWriterV2$WriterFactoryV2.createWriter(HFileWriterV2.java:97)
>>>     at org.apache.hadoop.hbase.io.hfile.HFile$WriterFactory.create(HFile.java:393)
>>>     at org.apache.hadoop.hbase.util.CompressionTest.doSmokeTest(CompressionTest.java:118)
>>>     at org.apache.hadoop.hbase.util.CompressionTest.main(CompressionTest.java:148)
>>>
>>> $ hbase shell
>>> 2014-08-27 06:23:38,707 INFO [main] Configuration.deprecation: hadoop.native.lib is deprecated. Instead, use io.native.lib.available
>>> HBase Shell; enter 'help' for list of supported commands.
>>> Type "exit" to leave the HBase Shell
>>> Version 0.98.4-hadoop2, rUnknown, Sun Aug  3 23:45:36 HKT 2014
>>>
>>> hbase(main):001:0> create 'tsnappy', { NAME => 'f', COMPRESSION => 'snappy'}
>>> SLF4J: Class path contains multiple SLF4J bindings.
>>> SLF4J: Found binding in [jar:file:/edh/hadoop/hbase-0.98.4-hadoop2/lib/slf4j-log4j12-1.6.4.jar!/org/slf4j/impl/StaticLoggerBinder.class]
>>> SLF4J: Found binding in [jar:file:/edh/hadoop/hadoop-2.4.1/share/hadoop/common/lib/slf4j-log4j12-1.7.5.jar!/org/slf4j/impl/StaticLoggerBinder.class]
>>> SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
>>>
>>> ERROR: java.io.IOException: Compression algorithm 'snappy' previously failed test.
>>>     at org.apache.hadoop.hbase.util.CompressionTest.testCompression(CompressionTest.java:85)
>>>     at org.apache.hadoop.hbase.master.HMaster.checkCompression(HMaster.java:1764)
>>>     at org.apache.hadoop.hbase.master.HMaster.checkCompression(HMaster.java:1757)
>>>     at org.apache.hadoop.hbase.master.HMaster.createTable(HMaster.java:1739)
>>>     at org.apache.hadoop.hbase.master.HMaster.createTable(HMaster.java:1774)
>>>     at org.apache.hadoop.hbase.protobuf.generated.MasterProtos$MasterService$2.callBlockingMethod(MasterProtos.java:40470)
>>>     at org.apache.hadoop.hbase.ipc.RpcServer.call(RpcServer.java:2027)
>>>     at org.apache.hadoop.hbase.ipc.CallRunner.run(CallRunner.java:98)
>>>     at org.apache.hadoop.hbase.ipc.FifoRpcScheduler$1.run(FifoRpcScheduler.java:74)
>>>     at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:439)
>>>     at java.util.concurrent.FutureTask$Sync.innerRun(FutureTask.java:303)
>>>     at java.util.concurrent.FutureTask.run(FutureTask.java:138)
>>>     at java.util.concurrent.ThreadPoolExecutor$Worker.runTask(ThreadPoolExecutor.java:895)
>>>     at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:918)
>>>     at java.lang.Thread.run(Thread.java:662)
>>>
>>> Regards
>>> Arthur
>>>
>>> On 26 Aug, 2014, at 11:02 pm, Sean Busbey wrote:
>>>
>>>> Hi Arthur!
>>>>
>>>> Our Snappy build instructions are currently out of date and I'm working on updating them [1]. In short, I don't think there are any special build steps for using snappy.
>>>>
>>>> I'm still working out what needs to be included in our instructions for local and cluster testing.
>>>>
>>>> If you use the test for compression options, locally things will fail because the native hadoop libs won't be present:
>>>>
>>>> bin/hbase org.apache.hadoop.hbase.util.CompressionTest file:///tmp/snappy-test snappy
>>>>
>>>> (For comparison, replace "snappy" with "gz" and you will get a warning about not having native libraries, but the test will succeed.)
>>>>
>>>> I believe JM's suggestion is for you to copy the Hadoop native libraries into the local HBase lib/native directory, which would allow the local test to pass. If you are running in a deployed Hadoop cluster, I would expect the necessary libraries to already be available to HBase.
>>>>
>>>> [1]: https://issues.apache.org/jira/browse/HBASE-6189
>>>>
>>>> -Sean
>>>>
>>>> On Tue, Aug 26, 2014 at 8:30 AM, Arthur.hk.chan@gmail.com <arthur.hk.chan@gmail.com> wrote:
>>>> Hi JM,
>>>>
>>>> Below are my commands; I tried two cases under the same source code folder:
>>>> a) compile with snappy parameters (failed),
>>>> b) compile without snappy parameters (successful).
>>>> Regards
>>>> Arthur
>>>>
>>>> wget http://mirrors.devlib.org/apache/hbase/stable/hbase-0.98.4-src.tar.gz
>>>> tar -vxf hbase-0.98.4-src.tar.gz
>>>> mv hbase-0.98.4 hbase-0.98.4-src_snappy
>>>> cd hbase-0.98.4-src_snappy
>>>> nano dev-support/generate-hadoopX-poms.sh
>>>>   (change hbase_home="/usr/local/hadoop/hbase-0.98.4-src_snappy")
>>>>
>>>> bash -x ./dev-support/generate-hadoopX-poms.sh 0.98.4 0.98.4-hadoop2
>>>>
>>>> a) with snappy parameters:
>>>> mvn -f pom.xml.hadoop2 install -DskipTests assembly:single -Prelease,hadoop-snappy -Dhadoop-snappy.version=0.0.1-SNAPSHOT
>>>> [INFO] ------------------------------------------------------------------------
>>>> [INFO] Building HBase - Server 0.98.4-hadoop2
>>>> [INFO] ------------------------------------------------------------------------
>>>> [WARNING] The POM for org.apache.hadoop:hadoop-snappy:jar:0.0.1-SNAPSHOT is missing, no dependency information available
>>>> [INFO] ------------------------------------------------------------------------
>>>> [INFO] Reactor Summary:
>>>> [INFO]
>>>> [INFO] HBase ............................................. SUCCESS [8.192s]
>>>> [INFO] HBase - Common .................................... SUCCESS [5.638s]
>>>> [INFO] HBase - Protocol .................................. SUCCESS [1.535s]
>>>> [INFO] HBase - Client .................................... SUCCESS [1.206s]
>>>> [INFO] HBase - Hadoop Compatibility ...................... SUCCESS [0.193s]
>>>> [INFO] HBase - Hadoop Two Compatibility .................. SUCCESS [0.798s]
>>>> [INFO] HBase - Prefix Tree ............................... SUCCESS [0.438s]
>>>> [INFO] HBase - Server .................................... FAILURE [0.234s]
>>>> [INFO] HBase - Testing Util .............................. SKIPPED
>>>> [INFO] HBase - Thrift .................................... SKIPPED
>>>> [INFO] HBase - Shell ..................................... SKIPPED
>>>> [INFO] HBase - Integration Tests ......................... SKIPPED
>>>> [INFO] HBase - Examples .................................. SKIPPED
>>>> [INFO] HBase - Assembly .................................. SKIPPED
>>>> [INFO] ------------------------------------------------------------------------
>>>> [INFO] BUILD FAILURE
>>>> [INFO] ------------------------------------------------------------------------
>>>> [INFO] Total time: 19.474s
>>>> [INFO] Finished at: Tue Aug 26 21:21:13 HKT 2014
>>>> [INFO] Final Memory: 51M/1100M
>>>> [INFO] ------------------------------------------------------------------------
>>>> [ERROR] Failed to execute goal on project hbase-server: Could not resolve dependencies for project org.apache.hbase:hbase-server:jar:0.98.4-hadoop2: Failure to find org.apache.hadoop:hadoop-snappy:jar:0.0.1-SNAPSHOT in http://maven.oschina.net/content/groups/public/ was cached in the local repository, resolution will not be reattempted until the update interval of nexus-osc has elapsed or updates are forced -> [Help 1]
>>>> [ERROR]
>>>> [ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
>>>> [ERROR] Re-run Maven using the -X switch to enable full debug logging.
>>>> [ERROR]
>>>> [ERROR] For more information about the errors and possible solutions, please read the following articles:
>>>> [ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/DependencyResolutionException
>>>> [ERROR]
>>>> [ERROR] After correcting the problems, you can resume the build with the command
>>>> [ERROR]   mvn <goals> -rf :hbase-server
>>>>
>>>> b) try again, without snappy parameters:
>>>> mvn -f pom.xml.hadoop2 install -DskipTests assembly:single -Prelease
>>>> [INFO] Building tar: /edh/hadoop_all_sources/hbase-0.98.4-src_snappy/hbase-assembly/target/hbase-0.98.4-hadoop2-bin.tar.gz
>>>> [INFO] ------------------------------------------------------------------------
>>>> [INFO] Reactor Summary:
>>>> [INFO]
>>>> [INFO] HBase ............................................. SUCCESS [3.290s]
>>>> [INFO] HBase - Common .................................... SUCCESS [3.119s]
>>>> [INFO] HBase - Protocol .................................. SUCCESS [0.972s]
>>>> [INFO] HBase - Client .................................... SUCCESS [0.920s]
>>>> [INFO] HBase - Hadoop Compatibility ...................... SUCCESS [0.167s]
>>>> [INFO] HBase - Hadoop Two Compatibility .................. SUCCESS [0.504s]
>>>> [INFO] HBase - Prefix Tree ............................... SUCCESS [0.382s]
>>>> [INFO] HBase - Server .................................... SUCCESS [4.790s]
>>>> [INFO] HBase - Testing Util .............................. SUCCESS [0.598s]
>>>> [INFO] HBase - Thrift .................................... SUCCESS [1.536s]
>>>> [INFO] HBase - Shell ..................................... SUCCESS [0.369s]
>>>> [INFO] HBase - Integration Tests ......................... SUCCESS [0.443s]
>>>> [INFO] HBase - Examples .................................. SUCCESS [0.459s]
>>>> [INFO] HBase - Assembly .................................. SUCCESS [13.240s]
>>>> [INFO] ------------------------------------------------------------------------
>>>> [INFO] BUILD SUCCESS
>>>> [INFO] ------------------------------------------------------------------------
>>>> [INFO] Total time: 31.408s
>>>> [INFO] Finished at: Tue Aug 26 21:22:50 HKT 2014
>>>> [INFO] Final Memory: 57M/1627M
>>>> [INFO] ------------------------------------------------------------------------
>>>>
>>>> On 26 Aug, 2014, at 8:52 pm, Jean-Marc Spaggiari <jean-marc@spaggiari.org> wrote:
>>>>
>>>>> Hi Arthur,
>>>>>
>>>>> How have you extracted the HBase source, and what command do you run to build? I will do the same here locally so I can provide you the exact steps to complete it.
>>>>>
>>>>> JM
>>>>>
>>>>> 2014-08-26 8:42 GMT-04:00 Arthur.hk.chan@gmail.com <arthur.hk.chan@gmail.com>:
>>>>>
>>>>>> Hi JM,
>>>>>>
>>>>>> Not too sure what you mean; do you mean I should create a new folder in my HBASE_SRC named lib/native/Linux-x86, copy these files into it, and then try to compile again?
>>>>>>
>>>>>> Regards
>>>>>> Arthur
>>>>>>
>>>>>> On 26 Aug, 2014, at 8:17 pm, Jean-Marc Spaggiari <jean-marc@spaggiari.org> wrote:
>>>>>>
>>>>>>> Hi Arthur,
>>>>>>>
>>>>>>> Almost done! You now need to copy them into the HBase folder.
>>>>>>>
>>>>>>> hbase@hbasetest1:~/hbase-0.98.2-hadoop2/lib$ tree | grep -v .jar | grep -v .rb
>>>>>>> .
>>>>>>> ├── native
>>>>>>> │   └── Linux-x86
>>>>>>> │       ├── libsnappy.a
>>>>>>> │       ├── libsnappy.la
>>>>>>> │       ├── libsnappy.so
>>>>>>> │       ├── libsnappy.so.1
>>>>>>> │       └── libsnappy.so.1.2.0
>>>>>>>
>>>>>>> I don't have any hadoop-snappy lib in my hbase folder and it works very well with Snappy for me...
>>>>>>>
>>>>>>> JM
>>>>>>>
>>>>>>> 2014-08-26 8:09 GMT-04:00 Arthur.hk.chan@gmail.com <arthur.hk.chan@gmail.com>:
>>>>>>>
>>>>>>>> Hi JM,
>>>>>>>>
>>>>>>>> Below are my steps to install the snappy lib; did I miss something?
>>>>>>>>
>>>>>>>> Regards
>>>>>>>> Arthur
>>>>>>>>
>>>>>>>> wget https://snappy.googlecode.com/files/snappy-1.1.1.tar.gz
>>>>>>>> tar -vxf snappy-1.1.1.tar.gz
>>>>>>>> cd snappy-1.1.1
>>>>>>>> ./configure
>>>>>>>> make
>>>>>>>> make install
>>>>>>>> make[1]: Entering directory `/edh/hadoop_all_sources/snappy-1.1.1'
>>>>>>>> test -z "/usr/local/lib" || /bin/mkdir -p "/usr/local/lib"
>>>>>>>> /bin/sh ./libtool --mode=install /usr/bin/install -c libsnappy.la '/usr/local/lib'
>>>>>>>> libtool: install: /usr/bin/install -c .libs/libsnappy.so.1.2.0 /usr/local/lib/libsnappy.so.1.2.0
>>>>>>>> libtool: install: (cd /usr/local/lib && { ln -s -f libsnappy.so.1.2.0 libsnappy.so.1 || { rm -f libsnappy.so.1 && ln -s libsnappy.so.1.2.0 libsnappy.so.1; }; })
>>>>>>>> libtool: install: (cd /usr/local/lib && { ln -s -f libsnappy.so.1.2.0 libsnappy.so || { rm -f libsnappy.so && ln -s libsnappy.so.1.2.0 libsnappy.so; }; })
>>>>>>>> libtool: install: /usr/bin/install -c .libs/libsnappy.lai /usr/local/lib/libsnappy.la
>>>>>>>> libtool: install: /usr/bin/install -c .libs/libsnappy.a /usr/local/lib/libsnappy.a
>>>>>>>> libtool: install: chmod 644 /usr/local/lib/libsnappy.a
>>>>>>>> libtool: install: ranlib /usr/local/lib/libsnappy.a
>>>>>>>> libtool: finish: PATH="/edh/hadoop/spark/bin:/edh/hadoop/hbase/bin:/edh/hadoop/zookeeper//bin:/edh/hadoop/yarn/hadoop/bin:/edh/hadoop/yarn/hadoop/sbin:/usr/lib64/qt-3.3/bin:/opt/apache-maven-3.1.1/bin:/usr/local/sbin:/usr/local/bin:/sbin:/bin:/usr/sbin:/usr/bin:/edh/hadoop/zookeeper//bin:/edh/hadoop/hive//bin:/usr/lib/jvm/jdk1.6.0_45//bin:/root/bin:/sbin" ldconfig -n /usr/local/lib
>>>>>>>> ----------------------------------------------------------------------
>>>>>>>> Libraries have been installed in:
>>>>>>>>    /usr/local/lib
>>>>>>>> If you ever happen to want to link against installed libraries
>>>>>>>> in a given directory, LIBDIR, you must either use libtool, and
>>>>>>>> specify the full pathname of the library, or use the `-LLIBDIR'
>>>>>>>> flag during linking and do at least one of the following:
>>>>>>>>    - add LIBDIR to the `LD_LIBRARY_PATH' environment variable
>>>>>>>>      during execution
>>>>>>>>    - add LIBDIR to the `LD_RUN_PATH' environment variable
>>>>>>>>      during linking
>>>>>>>>    - use the `-Wl,-rpath -Wl,LIBDIR' linker flag
>>>>>>>>    - have your system administrator add LIBDIR to `/etc/ld.so.conf'
>>>>>>>> See any operating system documentation about shared libraries for
>>>>>>>> more information, such as the ld(1) and ld.so(8) manual pages.
>>>>>>>> ----------------------------------------------------------------------
>>>>>>>> test -z "/usr/local/share/doc/snappy" || /bin/mkdir -p "/usr/local/share/doc/snappy"
>>>>>>>> /usr/bin/install -c -m 644 ChangeLog COPYING INSTALL NEWS README format_description.txt framing_format.txt '/usr/local/share/doc/snappy'
>>>>>>>> test -z "/usr/local/include" || /bin/mkdir -p "/usr/local/include"
>>>>>>>> /usr/bin/install -c -m 644 snappy.h snappy-sinksource.h snappy-stubs-public.h snappy-c.h '/usr/local/include'
>>>>>>>> make[1]: Leaving directory `/edh/hadoop_all_sources/snappy-1.1.1'
>>>>>>>>
>>>>>>>> ll /usr/local/lib
>>>>>>>> -rw-r--r--. 1 root root 233554 Aug 20 00:14 libsnappy.a
>>>>>>>> -rwxr-xr-x. 1 root root    953 Aug 20 00:14 libsnappy.la
>>>>>>>> lrwxrwxrwx. 1 root root     18 Aug 20 00:14 libsnappy.so -> libsnappy.so.1.2.0
>>>>>>>> lrwxrwxrwx. 1 root root     18 Aug 20 00:14 libsnappy.so.1 -> libsnappy.so.1.2.0
>>>>>>>> -rwxr-xr-x. 1 root root 147726 Aug 20 00:14 libsnappy.so.1.2.0
>>>>>>>>
>>>>>>>> On 26 Aug, 2014, at 7:38 pm, Jean-Marc Spaggiari <jean-marc@spaggiari.org> wrote:
>>>>>>>>
>>>>>>>>> Hi Arthur,
>>>>>>>>>
>>>>>>>>> Do you have the snappy libs installed and configured? HBase doesn't come with Snappy, so you need to have it first.
>>>>>>>>>
>>>>>>>>> Shameless plug: http://www.spaggiari.org/index.php/hbase/how-to-install-snappy-with-1#.U_xxSqdZuZY
>>>>>>>>>
>>>>>>>>> This is for 0.96 but should be very similar for 0.98. I will try it soon and post an update, but keep us posted here so we can support you...
>>>>>>>>> JM
>>>>>>>>>
>>>>>>>>> 2014-08-26 7:34 GMT-04:00 Arthur.hk.chan@gmail.com <arthur.hk.chan@gmail.com>:
>>>>>>>>>
>>>>>>>>>> Hi,
>>>>>>>>>>
>>>>>>>>>> I need to install snappy for HBase 0.98.4 (my Hadoop version is 2.4.1).
>>>>>>>>>>
>>>>>>>>>> Can you please advise what would be wrong? Could my pom.xml be incorrect or missing something?
>>>>>>>>>>
>>>>>>>>>> Regards
>>>>>>>>>> Arthur
>>>>>>>>>>
>>>>>>>>>> Below are my commands:
>>>>>>>>>> bash -x ./dev-support/generate-hadoopX-poms.sh 0.98.4 0.98.4-hadoop2
>>>>>>>>>> mvn -f pom.xml.hadoop2 install -DskipTests assembly:single -Prelease,hadoop-snappy
>>>>>>>>>>
>>>>>>>>>> Log:
>>>>>>>>>> [INFO] ------------------------------------------------------------------------
>>>>>>>>>> [INFO] Building HBase - Server 0.98.4-hadoop2
>>>>>>>>>> [INFO] ------------------------------------------------------------------------
>>>>>>>>>> [WARNING] The POM for org.apache.hadoop:hadoop-snappy:jar:0.0.1-SNAPSHOT is missing, no dependency information available
>>>>>>>>>> [INFO] ------------------------------------------------------------------------
>>>>>>>>>> [INFO] Reactor Summary:
>>>>>>>>>> [INFO]
>>>>>>>>>> [INFO] HBase ............................................. SUCCESS [3.129s]
>>>>>>>>>> [INFO] HBase - Common .................................... SUCCESS [3.105s]
>>>>>>>>>> [INFO] HBase - Protocol .................................. SUCCESS [0.976s]
>>>>>>>>>> [INFO] HBase - Client .................................... SUCCESS [0.925s]
>>>>>>>>>> [INFO] HBase - Hadoop Compatibility ...................... SUCCESS [0.183s]
>>>>>>>>>> [INFO] HBase - Hadoop Two Compatibility .................. SUCCESS [0.497s]
>>>>>>>>>> [INFO] HBase - Prefix Tree ............................... SUCCESS [0.407s]
>>>>>>>>>> [INFO] HBase - Server .................................... FAILURE [0.103s]
>>>>>>>>>> [INFO] HBase - Testing Util .............................. SKIPPED
>>>>>>>>>> [INFO] HBase - Thrift .................................... SKIPPED
>>>>>>>>>> [INFO] HBase - Shell ..................................... SKIPPED
>>>>>>>>>> [INFO] HBase - Integration Tests ......................... SKIPPED
>>>>>>>>>> [INFO] HBase - Examples .................................. SKIPPED
>>>>>>>>>> [INFO] HBase - Assembly .................................. SKIPPED
>>>>>>>>>> [INFO] ------------------------------------------------------------------------
>>>>>>>>>> [INFO] BUILD FAILURE
>>>>>>>>>> [INFO] ------------------------------------------------------------------------
>>>>>>>>>> [INFO] Total time: 9.939s
>>>>>>>>>> [INFO] Finished at: Tue Aug 26 19:23:14 HKT 2014
>>>>>>>>>> [INFO] Final Memory: 61M/2921M
>>>>>>>>>> [INFO] ------------------------------------------------------------------------
>>>>>>>>>> [ERROR] Failed to execute goal on project hbase-server: Could not resolve dependencies for project org.apache.hbase:hbase-server:jar:0.98.4-hadoop2: Failure to find org.apache.hadoop:hadoop-snappy:jar:0.0.1-SNAPSHOT in http://maven.oschina.net/content/groups/public/ was cached in the local repository, resolution will not be reattempted until the update interval of nexus-osc has elapsed or updates are forced -> [Help 1]
>>>>>>>>>> [ERROR]
>>>>>>>>>> [ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
>>>>>>>>>> [ERROR] Re-run Maven using the -X switch to enable full debug logging.
>>>>>>>>>> [ERROR]
>>>>>>>>>> [ERROR] For more information about the errors and possible solutions, please read the following articles:
>>>>>>>>>> [ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/DependencyResolutionException
>>>>>>>>>> [ERROR]
>>>>>>>>>> [ERROR] After correcting the problems, you can resume the build with the command
>>>>>>>>>> [ERROR]   mvn <goals> -rf :hbase-server
>>>>
>>>> --
>>>> Sean
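[Editor's note] Taking the thread as a whole, the approach that works is: don't build HBase against the unpublished hadoop-snappy artifact at all; instead install libsnappy on the machine and link it into the architecture-specific directory under HBase's lib/native (named after what `uname -m` reports, as JM points out). The sketch below consolidates those steps. The default HBASE_HOME path, the source library location /usr/local/lib, and the arch-to-directory mapping are assumptions drawn from the messages above, not an official recipe:

```shell
#!/bin/sh
# Sketch only: link an installed libsnappy into HBase's native lib dir.
# HBASE_HOME defaults to a sample path; adjust for your installation.
HBASE_HOME="${HBASE_HOME:-$HOME/hbase-0.98.4-hadoop2}"

# 1. Derive the native directory name from the machine architecture,
#    as suggested with `uname -m` in the thread (x86_64 -> Linux-amd64-64).
ARCH="$(uname -m)"
case "$ARCH" in
  x86_64) NATIVE_DIR="Linux-amd64-64" ;;
  i?86)   NATIVE_DIR="Linux-i386-32" ;;
  *)      NATIVE_DIR="Linux-$ARCH" ;;   # best guess for other arches
esac

# 2. Link the installed libsnappy into that directory (ln -sf will
#    create a dangling link if snappy is not actually installed there).
mkdir -p "$HBASE_HOME/lib/native/$NATIVE_DIR"
ln -sf /usr/local/lib/libsnappy.so "$HBASE_HOME/lib/native/$NATIVE_DIR/libsnappy.so"

echo "linked snappy into $HBASE_HOME/lib/native/$NATIVE_DIR"
```

After linking, re-run the smoke test from the thread (`bin/hbase org.apache.hadoop.hbase.util.CompressionTest file:///tmp/snappy-test snappy`). Note that it can still fail with "this version of libhadoop was built without snappy support": the Hadoop native library itself must also have been built with snappy, which is the wall Arthur hit above.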