Subject: Re: Compilation error: HBASE 0.98.4 with Snappy
From: "Arthur.hk.chan@gmail.com"
Date: Wed, 27 Aug 2014 07:25:01 +0800
To: Sean Busbey
Cc: user@hbase.apache.org

Hi,

I just tried three more steps but was not able to get through.

1) Copied the snappy files to $HBASE_HOME/lib:

$ cd $HBASE_HOME
$ ll lib/*sna*
-rw-r--r--. 1 hduser hadoop  11526 Aug 27 06:54 lib/hadoop-snappy-0.0.1-SNAPSHOT.jar
-rw-rw-r--. 1 hduser hadoop 995968 Aug  3 18:43 lib/snappy-java-1.0.4.1.jar

$ ll lib/native/
drwxrwxr-x. 4 hduser hadoop 4096 Aug 27 06:54 Linux-amd64-64

$ ll lib/native/Linux-amd64-64/
total 18964
lrwxrwxrwx. 1 hduser hadoop      24 Aug 27 07:08 libhadoopsnappy.so -> libhadoopsnappy.so.0.0.1
lrwxrwxrwx. 1 hduser hadoop      24 Aug 27 07:08 libhadoopsnappy.so.0 -> libhadoopsnappy.so.0.0.1
-rwxr-xr-x. 1 hduser hadoop   54961 Aug 27 07:08 libhadoopsnappy.so.0.0.1
lrwxrwxrwx. 1 hduser hadoop      55 Aug 27 07:08 libjvm.so -> /usr/lib/jvm/jdk1.6.0_45/jre/lib/amd64/server/libjvm.so
lrwxrwxrwx. 1 hduser hadoop      25 Aug 27 07:08 libprotobuf-lite.so -> libprotobuf-lite.so.8.0.0
lrwxrwxrwx. 1 hduser hadoop      25 Aug 27 07:08 libprotobuf-lite.so.8 -> libprotobuf-lite.so.8.0.0
-rwxr-xr-x. 1 hduser hadoop  964689 Aug 27 07:08 libprotobuf-lite.so.8.0.0
lrwxrwxrwx. 1 hduser hadoop      20 Aug 27 07:08 libprotobuf.so -> libprotobuf.so.8.0.0
lrwxrwxrwx. 1 hduser hadoop      20 Aug 27 07:08 libprotobuf.so.8 -> libprotobuf.so.8.0.0
-rwxr-xr-x. 1 hduser hadoop 8300050 Aug 27 07:08 libprotobuf.so.8.0.0
lrwxrwxrwx. 1 hduser hadoop      18 Aug 27 07:08 libprotoc.so -> libprotoc.so.8.0.0
lrwxrwxrwx. 1 hduser hadoop      18 Aug 27 07:08 libprotoc.so.8 -> libprotoc.so.8.0.0
-rwxr-xr-x. 1 hduser hadoop 9935810 Aug 27 07:08 libprotoc.so.8.0.0
lrwxrwxrwx. 1 hduser hadoop      18 Aug 27 07:08 libsnappy.so -> libsnappy.so.1.2.0
lrwxrwxrwx. 1 hduser hadoop      18 Aug 27 07:08 libsnappy.so.1 -> libsnappy.so.1.2.0
-rwxr-xr-x. 1 hduser hadoop  147726 Aug 27 07:08 libsnappy.so.1.2.0
drwxr-xr-x. 2 hduser hadoop    4096 Aug 27 07:08 pkgconfig

2) In $HBASE_HOME/conf/hbase-env.sh, added:

###
export LD_LIBRARY_PATH=$LD_LIBRARY_PATH:$HADOOP_HOME/lib/native/Linux-amd64-64/:/usr/local/lib/
export HBASE_LIBRARY_PATH=$HBASE_LIBRARY_PATH:$HBASE_HOME/lib/native/Linux-amd64-64/:/usr/local/lib/:$HBASE_HOME/lib/hadoop-snappy-0.0.1-SNAPSHOT.jar
export CLASSPATH=$CLASSPATH:$HBASE_LIBRARY_PATH
export HBASE_CLASSPATH=$HBASE_CLASSPATH:$HBASE_LIBRARY_PATH
###

3) Restarted HBase and tried again:

$ bin/hbase org.apache.hadoop.hbase.util.CompressionTest file:///tmp/snappy-test snappy
2014-08-27 07:16:09,490 INFO  [main] Configuration.deprecation: hadoop.native.lib is deprecated. Instead, use io.native.lib.available
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/edh/hadoop/hbase-0.98.4-hadoop2/lib/slf4j-log4j12-1.6.4.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/edh/hadoop/hadoop-2.4.1/share/hadoop/common/lib/slf4j-log4j12-1.7.5.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
2014-08-27 07:16:10,323 INFO  [main] util.ChecksumType: Checksum using org.apache.hadoop.util.PureJavaCrc32
2014-08-27 07:16:10,324 INFO  [main] util.ChecksumType: Checksum can use org.apache.hadoop.util.PureJavaCrc32C
Exception in thread "main" java.lang.RuntimeException: native snappy library not available: this version of libhadoop was built without snappy support.
	at org.apache.hadoop.io.compress.SnappyCodec.checkNativeCodeLoaded(SnappyCodec.java:64)
	at org.apache.hadoop.io.compress.SnappyCodec.getCompressorType(SnappyCodec.java:132)
	at org.apache.hadoop.io.compress.CodecPool.getCompressor(CodecPool.java:148)
	at org.apache.hadoop.io.compress.CodecPool.getCompressor(CodecPool.java:163)
	at org.apache.hadoop.hbase.io.compress.Compression$Algorithm.getCompressor(Compression.java:310)
	at org.apache.hadoop.hbase.io.encoding.HFileBlockDefaultEncodingContext.<init>(HFileBlockDefaultEncodingContext.java:92)
	at org.apache.hadoop.hbase.io.hfile.HFileBlock$Writer.<init>(HFileBlock.java:690)
	at org.apache.hadoop.hbase.io.hfile.HFileWriterV2.finishInit(HFileWriterV2.java:117)
	at org.apache.hadoop.hbase.io.hfile.HFileWriterV2.<init>(HFileWriterV2.java:109)
	at org.apache.hadoop.hbase.io.hfile.HFileWriterV2$WriterFactoryV2.createWriter(HFileWriterV2.java:97)
	at org.apache.hadoop.hbase.io.hfile.HFile$WriterFactory.create(HFile.java:393)
	at org.apache.hadoop.hbase.util.CompressionTest.doSmokeTest(CompressionTest.java:118)
	at org.apache.hadoop.hbase.util.CompressionTest.main(CompressionTest.java:148)

Regards
Arthur

On 27 Aug, 2014, at 6:27 am, Arthur.hk.chan@gmail.com wrote:
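The RuntimeException above is raised by Hadoop's SnappyCodec before any HBase code is involved: libhadoop.so must have been compiled with snappy support, and libsnappy must be loadable at runtime. A diagnostic sketch follows; the `checknative` and `ldd` commands are shown as comments because they need a live Hadoop install, and the sample report line is a hypothetical stand-in for their output.

```shell
# Cluster-side checks (Hadoop 2.x):
#
#   hadoop checknative -a
#   ldd $HADOOP_HOME/lib/native/libhadoop.so | grep snappy
#
# "checknative" prints one line per codec; the decision below mirrors what
# CompressionTest ends up doing with that information. The sample line is a
# stand-in for real output on a libhadoop built without snappy.
native_report="snappy:  false"
case "$native_report" in
  *"snappy:  true"*) echo "native snappy available" ;;
  *)                 echo "libhadoop lacks snappy: rebuild native libs" ;;
esac
```

If `checknative` reports snappy as false, no HBase-side configuration will help; libhadoop itself has to be rebuilt with snappy enabled.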
> Hi Sean,
> 
> Thanks for your reply.
> 
> I tried the following tests:
> 
> $ bin/hbase org.apache.hadoop.hbase.util.CompressionTest file:///tmp/snappy-test gz
> 2014-08-26 23:06:17,778 INFO  [main] Configuration.deprecation: hadoop.native.lib is deprecated. Instead, use io.native.lib.available
> SLF4J: Class path contains multiple SLF4J bindings.
> SLF4J: Found binding in [jar:file:/edh/hadoop/hbase-0.98.4-hadoop2/lib/slf4j-log4j12-1.6.4.jar!/org/slf4j/impl/StaticLoggerBinder.class]
> SLF4J: Found binding in [jar:file:/edh/hadoop/hadoop-2.4.1/share/hadoop/common/lib/slf4j-log4j12-1.7.5.jar!/org/slf4j/impl/StaticLoggerBinder.class]
> SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
> 2014-08-26 23:06:18,103 INFO  [main] util.ChecksumType: Checksum using org.apache.hadoop.util.PureJavaCrc32
> 2014-08-26 23:06:18,104 INFO  [main] util.ChecksumType: Checksum can use org.apache.hadoop.util.PureJavaCrc32C
> 2014-08-26 23:06:18,260 INFO  [main] zlib.ZlibFactory: Successfully loaded & initialized native-zlib library
> 2014-08-26 23:06:18,276 INFO  [main] compress.CodecPool: Got brand-new compressor [.gz]
> 2014-08-26 23:06:18,280 INFO  [main] compress.CodecPool: Got brand-new compressor [.gz]
> 2014-08-26 23:06:18,921 INFO  [main] compress.CodecPool: Got brand-new decompressor [.gz]
> SUCCESS
> 
> 
> $ bin/hbase org.apache.hadoop.hbase.util.CompressionTest file:///tmp/snappy-test snappy
> 2014-08-26 23:07:08,246 INFO  [main] Configuration.deprecation: hadoop.native.lib is deprecated. Instead, use io.native.lib.available
> SLF4J: Class path contains multiple SLF4J bindings.
> SLF4J: Found binding in [jar:file:/edh/hadoop/hbase-0.98.4-hadoop2/lib/slf4j-log4j12-1.6.4.jar!/org/slf4j/impl/StaticLoggerBinder.class]
> SLF4J: Found binding in [jar:file:/edh/hadoop/hadoop-2.4.1/share/hadoop/common/lib/slf4j-log4j12-1.7.5.jar!/org/slf4j/impl/StaticLoggerBinder.class]
> SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
> 2014-08-26 23:07:08,578 INFO  [main] util.ChecksumType: Checksum using org.apache.hadoop.util.PureJavaCrc32
> 2014-08-26 23:07:08,579 INFO  [main] util.ChecksumType: Checksum can use org.apache.hadoop.util.PureJavaCrc32C
> Exception in thread "main" java.lang.RuntimeException: native snappy library not available: this version of libhadoop was built without snappy support.
> 	at org.apache.hadoop.io.compress.SnappyCodec.checkNativeCodeLoaded(SnappyCodec.java:64)
> 	at org.apache.hadoop.io.compress.SnappyCodec.getCompressorType(SnappyCodec.java:132)
> 	at org.apache.hadoop.io.compress.CodecPool.getCompressor(CodecPool.java:148)
> 	at org.apache.hadoop.io.compress.CodecPool.getCompressor(CodecPool.java:163)
> 	at org.apache.hadoop.hbase.io.compress.Compression$Algorithm.getCompressor(Compression.java:310)
> 	at org.apache.hadoop.hbase.io.encoding.HFileBlockDefaultEncodingContext.<init>(HFileBlockDefaultEncodingContext.java:92)
> 	at org.apache.hadoop.hbase.io.hfile.HFileBlock$Writer.<init>(HFileBlock.java:690)
> 	at org.apache.hadoop.hbase.io.hfile.HFileWriterV2.finishInit(HFileWriterV2.java:117)
> 	at org.apache.hadoop.hbase.io.hfile.HFileWriterV2.<init>(HFileWriterV2.java:109)
> 	at org.apache.hadoop.hbase.io.hfile.HFileWriterV2$WriterFactoryV2.createWriter(HFileWriterV2.java:97)
> 	at org.apache.hadoop.hbase.io.hfile.HFile$WriterFactory.create(HFile.java:393)
> 	at org.apache.hadoop.hbase.util.CompressionTest.doSmokeTest(CompressionTest.java:118)
> 	at org.apache.hadoop.hbase.util.CompressionTest.main(CompressionTest.java:148)
> 
> 
> $ hbase shell
> 2014-08-27 06:23:38,707 INFO  [main] Configuration.deprecation: hadoop.native.lib is deprecated. Instead, use io.native.lib.available
> HBase Shell; enter 'help' for list of supported commands.
> Type "exit" to leave the HBase Shell
> Version 0.98.4-hadoop2, rUnknown, Sun Aug  3 23:45:36 HKT 2014
> 
> hbase(main):001:0> 
> hbase(main):001:0> create 'tsnappy', { NAME => 'f', COMPRESSION => 'snappy'}
> SLF4J: Class path contains multiple SLF4J bindings.
> SLF4J: Found binding in [jar:file:/edh/hadoop/hbase-0.98.4-hadoop2/lib/slf4j-log4j12-1.6.4.jar!/org/slf4j/impl/StaticLoggerBinder.class]
> SLF4J: Found binding in [jar:file:/edh/hadoop/hadoop-2.4.1/share/hadoop/common/lib/slf4j-log4j12-1.7.5.jar!/org/slf4j/impl/StaticLoggerBinder.class]
> SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
> 
> ERROR: java.io.IOException: Compression algorithm 'snappy' previously failed test.
> 	at org.apache.hadoop.hbase.util.CompressionTest.testCompression(CompressionTest.java:85)
> 	at org.apache.hadoop.hbase.master.HMaster.checkCompression(HMaster.java:1764)
> 	at org.apache.hadoop.hbase.master.HMaster.checkCompression(HMaster.java:1757)
> 	at org.apache.hadoop.hbase.master.HMaster.createTable(HMaster.java:1739)
> 	at org.apache.hadoop.hbase.master.HMaster.createTable(HMaster.java:1774)
> 	at org.apache.hadoop.hbase.protobuf.generated.MasterProtos$MasterService$2.callBlockingMethod(MasterProtos.java:40470)
> 	at org.apache.hadoop.hbase.ipc.RpcServer.call(RpcServer.java:2027)
> 	at org.apache.hadoop.hbase.ipc.CallRunner.run(CallRunner.java:98)
> 	at org.apache.hadoop.hbase.ipc.FifoRpcScheduler$1.run(FifoRpcScheduler.java:74)
> 	at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:439)
> 	at java.util.concurrent.FutureTask$Sync.innerRun(FutureTask.java:303)
> 	at java.util.concurrent.FutureTask.run(FutureTask.java:138)
> 	at java.util.concurrent.ThreadPoolExecutor$Worker.runTask(ThreadPoolExecutor.java:895)
> 	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:918)
> 	at java.lang.Thread.run(Thread.java:662)
> 
> 
> Regards
> Arthur
> 
> 
> On 26 Aug, 2014, at 11:02 pm, Sean Busbey wrote:
> 
>> Hi Arthur!
>> 
>> Our Snappy build instructions are currently out of date and I'm working on updating them [1]. In short, I don't think there are any special build steps for using snappy.
>> 
>> I'm still working out what needs to be included in our instructions for local and cluster testing.
>> 
>> If you use the test for compression options, locally things will fail because the native hadoop libs won't be present:
>> 
>> bin/hbase org.apache.hadoop.hbase.util.CompressionTest file:///tmp/snappy-test snappy
>> (For comparison, replace "snappy" with "gz" and you will get a warning about not having native libraries, but the test will succeed.)
>> 
>> I believe JM's suggestion is for you to copy the Hadoop native libraries into the local HBase lib/native directory, which would allow the local test to pass. If you are running in a deployed Hadoop cluster, I would expect the necessary libraries to already be available to HBase.
>> 
>> [1]: https://issues.apache.org/jira/browse/HBASE-6189
>> 
>> -Sean
>> 
>> 
>> On Tue, Aug 26, 2014 at 8:30 AM, Arthur.hk.chan@gmail.com wrote:
>> Hi JM,
>> 
>> Below are my commands; I tried two cases under the same source code folder:
>> a) compile with the snappy parameters (failed),
>> b) compile without the snappy parameters (successful).
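Case (a) fails during dependency resolution because `org.apache.hadoop:hadoop-snappy:jar:0.0.1-SNAPSHOT` was never published to a public Maven repository; the `hadoop-snappy` profile only resolves if that jar has already been built and installed into the local repository. A sketch, assuming the historical hadoop-snappy project layout (the checkout URL is an assumption and may no longer resolve):

```shell
# Hypothetical pre-step before "mvn ... -Prelease,hadoop-snappy":
#
#   svn checkout http://hadoop-snappy.googlecode.com/svn/trunk/ hadoop-snappy
#   cd hadoop-snappy && mvn install -DskipTests   # installs the jar into ~/.m2
#
# Maven then resolves the SNAPSHOT from the local repository instead of the
# remote mirror that produced the "Failure to find" error:
repo_path="$HOME/.m2/repository/org/apache/hadoop/hadoop-snappy/0.0.1-SNAPSHOT"
if [ -d "$repo_path" ]; then
  echo "hadoop-snappy found in local repository"
else
  echo "hadoop-snappy not installed locally: the snappy build will fail"
fi
```

Note that this only fixes the Maven resolution failure; it does not by itself give libhadoop native snappy support.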
>> 
>> Regards
>> Arthur
>> 
>> wget http://mirrors.devlib.org/apache/hbase/stable/hbase-0.98.4-src.tar.gz
>> tar -vxf hbase-0.98.4-src.tar.gz
>> mv hbase-0.98.4 hbase-0.98.4-src_snappy
>> cd hbase-0.98.4-src_snappy
>> nano dev-support/generate-hadoopX-poms.sh
>> (change hbase_home="/usr/local/hadoop/hbase-0.98.4-src_snappy")
>> 
>> bash -x ./dev-support/generate-hadoopX-poms.sh 0.98.4 0.98.4-hadoop2
>> 
>> a) with snappy parameters:
>> mvn -f pom.xml.hadoop2 install -DskipTests assembly:single -Prelease,hadoop-snappy -Dhadoop-snappy.version=0.0.1-SNAPSHOT
>> [INFO] ------------------------------------------------------------------------
>> [INFO] Building HBase - Server 0.98.4-hadoop2
>> [INFO] ------------------------------------------------------------------------
>> [WARNING] The POM for org.apache.hadoop:hadoop-snappy:jar:0.0.1-SNAPSHOT is missing, no dependency information available
>> [INFO] ------------------------------------------------------------------------
>> [INFO] Reactor Summary:
>> [INFO]
>> [INFO] HBase ............................................. SUCCESS [8.192s]
>> [INFO] HBase - Common .................................... SUCCESS [5.638s]
>> [INFO] HBase - Protocol .................................. SUCCESS [1.535s]
>> [INFO] HBase - Client .................................... SUCCESS [1.206s]
>> [INFO] HBase - Hadoop Compatibility ...................... SUCCESS [0.193s]
>> [INFO] HBase - Hadoop Two Compatibility .................. SUCCESS [0.798s]
>> [INFO] HBase - Prefix Tree ............................... SUCCESS [0.438s]
>> [INFO] HBase - Server .................................... FAILURE [0.234s]
>> [INFO] HBase - Testing Util .............................. SKIPPED
>> [INFO] HBase - Thrift .................................... SKIPPED
>> [INFO] HBase - Shell ..................................... SKIPPED
>> [INFO] HBase - Integration Tests ......................... SKIPPED
>> [INFO] HBase - Examples .................................. SKIPPED
>> [INFO] HBase - Assembly .................................. SKIPPED
>> [INFO] ------------------------------------------------------------------------
>> [INFO] BUILD FAILURE
>> [INFO] ------------------------------------------------------------------------
>> [INFO] Total time: 19.474s
>> [INFO] Finished at: Tue Aug 26 21:21:13 HKT 2014
>> [INFO] Final Memory: 51M/1100M
>> [INFO] ------------------------------------------------------------------------
>> [ERROR] Failed to execute goal on project hbase-server: Could not resolve dependencies for project org.apache.hbase:hbase-server:jar:0.98.4-hadoop2: Failure to find org.apache.hadoop:hadoop-snappy:jar:0.0.1-SNAPSHOT in http://maven.oschina.net/content/groups/public/ was cached in the local repository, resolution will not be reattempted until the update interval of nexus-osc has elapsed or updates are forced -> [Help 1]
>> [ERROR]
>> [ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
>> [ERROR] Re-run Maven using the -X switch to enable full debug logging.
>> [ERROR]
>> [ERROR] For more information about the errors and possible solutions, please read the following articles:
>> [ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/DependencyResolutionException
>> [ERROR]
>> [ERROR] After correcting the problems, you can resume the build with the command
>> [ERROR]   mvn -rf :hbase-server
>> 
>> 
>> b) try again, without snappy parameters:
>> mvn -f pom.xml.hadoop2 install -DskipTests assembly:single -Prelease
>> [INFO] Building tar: /edh/hadoop_all_sources/hbase-0.98.4-src_snappy/hbase-assembly/target/hbase-0.98.4-hadoop2-bin.tar.gz
>> [INFO] ------------------------------------------------------------------------
>> [INFO] Reactor Summary:
>> [INFO]
>> [INFO] HBase ............................................. SUCCESS [3.290s]
>> [INFO] HBase - Common .................................... SUCCESS [3.119s]
>> [INFO] HBase - Protocol .................................. SUCCESS [0.972s]
>> [INFO] HBase - Client .................................... SUCCESS [0.920s]
>> [INFO] HBase - Hadoop Compatibility ...................... SUCCESS [0.167s]
>> [INFO] HBase - Hadoop Two Compatibility .................. SUCCESS [0.504s]
>> [INFO] HBase - Prefix Tree ............................... SUCCESS [0.382s]
>> [INFO] HBase - Server .................................... SUCCESS [4.790s]
>> [INFO] HBase - Testing Util .............................. SUCCESS [0.598s]
>> [INFO] HBase - Thrift .................................... SUCCESS [1.536s]
>> [INFO] HBase - Shell ..................................... SUCCESS [0.369s]
>> [INFO] HBase - Integration Tests ......................... SUCCESS [0.443s]
>> [INFO] HBase - Examples .................................. SUCCESS [0.459s]
>> [INFO] HBase - Assembly .................................. SUCCESS [13.240s]
>> [INFO] ------------------------------------------------------------------------
>> [INFO] BUILD SUCCESS
>> [INFO] ------------------------------------------------------------------------
>> [INFO] Total time: 31.408s
>> [INFO] Finished at: Tue Aug 26 21:22:50 HKT 2014
>> [INFO] Final Memory: 57M/1627M
>> [INFO] ------------------------------------------------------------------------
>> 
>> 
>> On 26 Aug, 2014, at 8:52 pm, Jean-Marc Spaggiari wrote:
>> 
>> > Hi Arthur,
>> >
>> > How have you extracted HBase source and what command do you run to build? I
>> > will do the same here locally so I can provide you the exact steps to
>> > complete.
>> >
>> > JM
>> >
>> >
>> > 2014-08-26 8:42 GMT-04:00 Arthur.hk.chan@gmail.com:
>> >
>> >> Hi JM,
>> >>
>> >> Not too sure what you mean; do you mean I should create a new folder in my
>> >> HBASE_SRC named lib/native/Linux-x86 and copy these files to this folder,
>> >> then try to compile it again?
>> >>
>> >> Regards
>> >> Arthur
>> >>
>> >>
>> >> On 26 Aug, 2014, at 8:17 pm, Jean-Marc Spaggiari wrote:
>> >>
>> >>> Hi Arthur,
>> >>>
>> >>> Almost done! You now need to copy them into the HBase folder.
>> >>>
>> >>> hbase@hbasetest1:~/hbase-0.98.2-hadoop2/lib$ tree | grep -v .jar | grep -v .rb
>> >>> .
>> >>> ├── native
>> >>> │   └── Linux-x86
>> >>> │       ├── libsnappy.a
>> >>> │       ├── libsnappy.la
>> >>> │       ├── libsnappy.so
>> >>> │       ├── libsnappy.so.1
>> >>> │       └── libsnappy.so.1.2.0
>> >>>
>> >>> I don't have any hadoop-snappy lib in my hbase folder and it works very
>> >>> well with Snappy for me...
>> >>>
>> >>> JM
>> >>>
>> >>> 2014-08-26 8:09 GMT-04:00 Arthur.hk.chan@gmail.com:
>> >>>
>> >>>> Hi JM,
>> >>>>
>> >>>> Below are my steps to install the snappy lib; do I miss something?
>> >>>>
>> >>>> Regards
>> >>>> Arthur
>> >>>>
>> >>>> wget https://snappy.googlecode.com/files/snappy-1.1.1.tar.gz
>> >>>> tar -vxf snappy-1.1.1.tar.gz
>> >>>> cd snappy-1.1.1
>> >>>> ./configure
>> >>>> make
>> >>>> make install
>> >>>> make[1]: Entering directory `/edh/hadoop_all_sources/snappy-1.1.1'
>> >>>> test -z "/usr/local/lib" || /bin/mkdir -p "/usr/local/lib"
>> >>>> /bin/sh ./libtool --mode=install /usr/bin/install -c libsnappy.la '/usr/local/lib'
>> >>>> libtool: install: /usr/bin/install -c .libs/libsnappy.so.1.2.0 /usr/local/lib/libsnappy.so.1.2.0
>> >>>> libtool: install: (cd /usr/local/lib && { ln -s -f libsnappy.so.1.2.0 libsnappy.so.1 || { rm -f libsnappy.so.1 && ln -s libsnappy.so.1.2.0 libsnappy.so.1; }; })
>> >>>> libtool: install: (cd /usr/local/lib && { ln -s -f libsnappy.so.1.2.0 libsnappy.so || { rm -f libsnappy.so && ln -s libsnappy.so.1.2.0 libsnappy.so; }; })
>> >>>> libtool: install: /usr/bin/install -c .libs/libsnappy.lai /usr/local/lib/libsnappy.la
>> >>>> libtool: install: /usr/bin/install -c .libs/libsnappy.a /usr/local/lib/libsnappy.a
>> >>>> libtool: install: chmod 644 /usr/local/lib/libsnappy.a
>> >>>> libtool: install: ranlib /usr/local/lib/libsnappy.a
>> >>>> libtool: finish: PATH="/edh/hadoop/spark/bin:/edh/hadoop/hbase/bin:/edh/hadoop/zookeeper//bin:/edh/hadoop/yarn/hadoop/bin:/edh/hadoop/yarn/hadoop/sbin:/usr/lib64/qt-3.3/bin:/opt/apache-maven-3.1.1/bin:/usr/local/sbin:/usr/local/bin:/sbin:/bin:/usr/sbin:/usr/bin:/edh/hadoop/zookeeper//bin:/edh/hadoop/hive//bin:/usr/lib/jvm/jdk1.6.0_45//bin:/root/bin:/sbin" ldconfig -n /usr/local/lib
>> >>>>
>> >>>> ----------------------------------------------------------------------
>> >>>> Libraries have been installed in:
>> >>>>    /usr/local/lib
>> >>>>
>> >>>> If you ever happen to want to link against installed libraries
>> >>>> in a given directory, LIBDIR, you must either use libtool, and
>> >>>> specify the full pathname of the library, or use the `-LLIBDIR'
>> >>>> flag during linking and do at least one of the following:
>> >>>>    - add LIBDIR to the `LD_LIBRARY_PATH' environment variable
>> >>>>      during execution
>> >>>>    - add LIBDIR to the `LD_RUN_PATH' environment variable
>> >>>>      during linking
>> >>>>    - use the `-Wl,-rpath -Wl,LIBDIR' linker flag
>> >>>>    - have your system administrator add LIBDIR to `/etc/ld.so.conf'
>> >>>>
>> >>>> See any operating system documentation about shared libraries for
>> >>>> more information, such as the ld(1) and ld.so(8) manual pages.
>> >>>> ----------------------------------------------------------------------
>> >>>> test -z "/usr/local/share/doc/snappy" || /bin/mkdir -p "/usr/local/share/doc/snappy"
>> >>>> /usr/bin/install -c -m 644 ChangeLog COPYING INSTALL NEWS README format_description.txt framing_format.txt '/usr/local/share/doc/snappy'
>> >>>> test -z "/usr/local/include" || /bin/mkdir -p "/usr/local/include"
>> >>>> /usr/bin/install -c -m 644 snappy.h snappy-sinksource.h snappy-stubs-public.h snappy-c.h '/usr/local/include'
>> >>>> make[1]: Leaving directory `/edh/hadoop_all_sources/snappy-1.1.1'
>> >>>>
>> >>>> ll /usr/local/lib
>> >>>> -rw-r--r--. 1 root root 233554 Aug 20 00:14 libsnappy.a
>> >>>> -rwxr-xr-x. 1 root root    953 Aug 20 00:14 libsnappy.la
>> >>>> lrwxrwxrwx. 1 root root     18 Aug 20 00:14 libsnappy.so -> libsnappy.so.1.2.0
>> >>>> lrwxrwxrwx. 1 root root     18 Aug 20 00:14 libsnappy.so.1 -> libsnappy.so.1.2.0
>> >>>> -rwxr-xr-x. 1 root root 147726 Aug 20 00:14 libsnappy.so.1.2.0
>> >>>>
>> >>>>
>> >>>>
>> >>>> On 26 Aug, 2014, at 7:38 pm, Jean-Marc Spaggiari <jean-marc@spaggiari.org> wrote:
>> >>>>
>> >>>>> Hi Arthur,
>> >>>>>
>> >>>>> Do you have the snappy libs installed and configured? HBase doesn't come
>> >>>>> with Snappy, so you need to have it first.
>> >>>>>
>> >>>>> Shameless plug:
>> >>>>> http://www.spaggiari.org/index.php/hbase/how-to-install-snappy-with-1#.U_xxSqdZuZY
>> >>>>>
>> >>>>> This is for 0.96 but should be very similar for 0.98. I will try it
>> >>>>> soon and post an update, but keep us posted here so we can support you...
>> >>>>>
>> >>>>> JM
>> >>>>>
>> >>>>>
>> >>>>> 2014-08-26 7:34 GMT-04:00 Arthur.hk.chan@gmail.com:
>> >>>>>
>> >>>>>> Hi,
>> >>>>>>
>> >>>>>> I need to install snappy for HBase 0.98.4 (my Hadoop version is 2.4.1).
>> >>>>>>
>> >>>>>> Can you please advise what would be wrong? Could my pom.xml be incorrect
>> >>>>>> and missing something?
>> >>>>>>
>> >>>>>> Regards
>> >>>>>> Arthur
>> >>>>>>
>> >>>>>>
>> >>>>>> Below are my commands:
>> >>>>>> bash -x ./dev-support/generate-hadoopX-poms.sh 0.98.4 0.98.4-hadoop2
>> >>>>>> mvn -f pom.xml.hadoop2 install -DskipTests assembly:single -Prelease,hadoop-snappy
>> >>>>>>
>> >>>>>> Log:
>> >>>>>> [INFO] ------------------------------------------------------------------------
>> >>>>>> [INFO] Building HBase - Server 0.98.4-hadoop2
>> >>>>>> [INFO] ------------------------------------------------------------------------
>> >>>>>> [WARNING] The POM for org.apache.hadoop:hadoop-snappy:jar:0.0.1-SNAPSHOT is missing, no dependency information available
>> >>>>>> [INFO] ------------------------------------------------------------------------
>> >>>>>> [INFO] Reactor Summary:
>> >>>>>> [INFO]
>> >>>>>> [INFO] HBase ............................................. SUCCESS [3.129s]
>> >>>>>> [INFO] HBase - Common .................................... SUCCESS [3.105s]
>> >>>>>> [INFO] HBase - Protocol .................................. SUCCESS [0.976s]
>> >>>>>> [INFO] HBase - Client .................................... SUCCESS [0.925s]
>> >>>>>> [INFO] HBase - Hadoop Compatibility ...................... SUCCESS [0.183s]
>> >>>>>> [INFO] HBase - Hadoop Two Compatibility .................. SUCCESS [0.497s]
>> >>>>>> [INFO] HBase - Prefix Tree ............................... SUCCESS [0.407s]
>> >>>>>> [INFO] HBase - Server .................................... FAILURE [0.103s]
>> >>>>>> [INFO] HBase - Testing Util .............................. SKIPPED
>> >>>>>> [INFO] HBase - Thrift .................................... SKIPPED
>> >>>>>> [INFO] HBase - Shell ..................................... SKIPPED
>> >>>>>> [INFO] HBase - Integration Tests ......................... SKIPPED
>> >>>>>> [INFO] HBase - Examples .................................. SKIPPED
>> >>>>>> [INFO] HBase - Assembly .................................. SKIPPED
>> >>>>>> [INFO] ------------------------------------------------------------------------
>> >>>>>> [INFO] BUILD FAILURE
>> >>>>>> [INFO] ------------------------------------------------------------------------
>> >>>>>> [INFO] Total time: 9.939s
>> >>>>>> [INFO] Finished at: Tue Aug 26 19:23:14 HKT 2014
>> >>>>>> [INFO] Final Memory: 61M/2921M
>> >>>>>> [INFO] ------------------------------------------------------------------------
>> >>>>>> [ERROR] Failed to execute goal on project hbase-server: Could not resolve dependencies for project org.apache.hbase:hbase-server:jar:0.98.4-hadoop2: Failure to find org.apache.hadoop:hadoop-snappy:jar:0.0.1-SNAPSHOT in http://maven.oschina.net/content/groups/public/ was cached in the local repository, resolution will not be reattempted until the update interval of nexus-osc has elapsed or updates are forced -> [Help 1]
>> >>>>>> [ERROR]
>> >>>>>> [ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
>> >>>>>> [ERROR] Re-run Maven using the -X switch to enable full debug logging.
>> >>>>>> [ERROR]
>> >>>>>> [ERROR] For more information about the errors and possible solutions, please read the following articles:
>> >>>>>> [ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/DependencyResolutionException
>> >>>>>> [ERROR]
>> >>>>>> [ERROR] After correcting the problems, you can resume the build with the command
>> >>>>>> [ERROR]   mvn -rf :hbase-server
>> 
>> 
>> -- 
>> Sean
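Every snappy failure in this thread traces back to the CompressionTest message "this version of libhadoop was built without snappy support". The usual remedy is to rebuild Hadoop's native libraries with snappy enabled and expose them to HBase; a minimal sketch, assuming a Hadoop 2.4.1 source tree and snappy installed under /usr/local (the Maven flags are documented in Hadoop's BUILDING.txt; all paths here are illustrative):

```shell
# 1) Rebuild Hadoop's native code with snappy compiled in (run in the Hadoop
#    source tree; needs protobuf 2.5 and a C toolchain). Shown as a comment
#    because it requires a full Hadoop build environment:
#
#      mvn package -Pdist,native -DskipTests -Dtar \
#          -Drequire.snappy -Dsnappy.prefix=/usr/local
#
# 2) Expose the rebuilt libraries to HBase. The copy step is simulated below
#    in temp dirs with placeholder files so the target layout is verifiable.
hadoop_native="$(mktemp -d)/hadoop/lib/native"              # stands in for $HADOOP_HOME/lib/native
hbase_native="$(mktemp -d)/hbase/lib/native/Linux-amd64-64" # stands in for the HBase side
mkdir -p "$hadoop_native" "$hbase_native"
touch "$hadoop_native/libhadoop.so" "$hadoop_native/libsnappy.so.1"  # placeholders
cp "$hadoop_native/"* "$hbase_native/"
ls "$hbase_native"
```

On a real cluster the copy targets would be $HADOOP_HOME/lib/native and $HBASE_HOME/lib/native/Linux-amd64-64, matching the layout shown in step 1 of the message at the top of this thread.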