From: "Peter Naudus" <peter.s.naudus@gmail.com>
Organization: DataRaker
Reply-To: pnaudus@dataraker.com
To: user@hbase.apache.org, "Nathaniel Cook"
Subject: Re: java.io.IOException: Compression algorithm 'snappy' previously failed test
Date: Mon, 23 Apr 2012 11:23:50 -0400

Hi Nathaniel,

My memory's a little hazy (we ended up going back to CDH3), but I believe the key to fixing this problem for me was the following log message:

> WARN util.NativeCodeLoader: Unable to load native-hadoop library for
> your platform... using builtin-java classes where applicable

I had set HBASE_LIBRARY_PATH to Hadoop's library path, since that was where snappy was. Once I set the path (or created symbolic links) so that HBASE_LIBRARY_PATH included both HBase's and Hadoop's native library directories, the snappy issue was resolved.

Also, in general, I ended up having much better luck with the RPMs than with the tarballs.
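Roughly, the idea was something like the following sketch. The paths are illustrative only -- adjust them to wherever your Hadoop and HBase installs keep their native libraries, and note that the Linux-amd64-64 subdirectory name assumes a 64-bit Linux box:

    # Assumed locations -- adjust to your install
    HADOOP_NATIVE=/opt/hadoop/lib/native
    HBASE_NATIVE=$HBASE_HOME/lib/native/Linux-amd64-64

    # Either point HBASE_LIBRARY_PATH at both directories...
    export HBASE_LIBRARY_PATH=$HBASE_NATIVE:$HADOOP_NATIVE

    # ...or symlink the Hadoop-provided libraries into HBase's native dir
    ln -s $HADOOP_NATIVE/libsnappy.so* $HBASE_NATIVE/
    ln -s $HADOOP_NATIVE/libhadoop.so* $HBASE_NATIVE/

    # Then re-run the smoke test
    $HBASE_HOME/bin/hbase org.apache.hadoop.hbase.util.CompressionTest file:///tmp/test.txt snappy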
~ Peter

On Mon, 23 Apr 2012 11:01:19 -0400, Nathaniel Cook wrote:

> Was there any resolution to this? I am experiencing the same issue.
>
> Nathaniel
>
> On Wed, Feb 29, 2012 at 10:52 AM, Peter Naudus wrote:
>
>> Thanks for your help :)
>>
>> To make sure, I manually set LD_LIBRARY_PATH, LIBRARY_PATH, and HBASE_LIBRARY_PATH:
>>
>>   bash-3.2$ export LD_LIBRARY_PATH=$LD_LIBRARY_PATH:/opt/dataraker/software/cdh4/hadoop-0.23.0-cdh4b1/lib/native
>>   bash-3.2$ export LIBRARY_PATH=$LIBRARY_PATH:/opt/dataraker/software/cdh4/hadoop-0.23.0-cdh4b1/lib/native
>>   bash-3.2$ export HBASE_LIBRARY_PATH=/opt/dataraker/software/cdh4/hadoop-0.23.0-cdh4b1/lib/native
>>
>> But running the compression test failed with "native snappy library not available":
>>
>>   bash-3.2$ ./hbase org.apache.hadoop.hbase.util.CompressionTest file:///tmp/test.txt snappy
>>   log4j:WARN No appenders could be found for logger (org.apache.hadoop.conf.Configuration).
>>   log4j:WARN Please initialize the log4j system properly.
>>   log4j:WARN See http://logging.apache.org/log4j/1.2/faq.html#noconfig for more info.
>>   Exception in thread "main" java.lang.RuntimeException: native snappy library not available
>>       at org.apache.hadoop.io.compress.SnappyCodec.getCompressorType(SnappyCodec.java:121)
>>       at org.apache.hadoop.io.compress.CodecPool.getCompressor(CodecPool.java:104)
>>       at org.apache.hadoop.io.compress.CodecPool.getCompressor(CodecPool.java:118)
>>       at org.apache.hadoop.hbase.io.hfile.Compression$Algorithm.getCompressor(Compression.java:236)
>>       at org.apache.hadoop.hbase.io.hfile.HFileBlock$Writer.<init>(HFileBlock.java:588)
>>       at org.apache.hadoop.hbase.io.hfile.HFileWriterV2.finishInit(HFileWriterV2.java:178)
>>       at org.apache.hadoop.hbase.io.hfile.HFileWriterV2.<init>(HFileWriterV2.java:150)
>>       at org.apache.hadoop.hbase.io.hfile.HFileWriterV2.<init>(HFileWriterV2.java:140)
>>       at org.apache.hadoop.hbase.io.hfile.HFileWriterV2$WriterFactoryV2.createWriter(HFileWriterV2.java:104)
>>       at org.apache.hadoop.hbase.util.CompressionTest.doSmokeTest(CompressionTest.java:108)
>>       at org.apache.hadoop.hbase.util.CompressionTest.main(CompressionTest.java:137)
>>
>> I verified that libsnappy is indeed installed:
>>
>>   bash-3.2$ ls -al $HBASE_LIBRARY_PATH
>>   total 1412
>>   drwxr-xr-x 2 1106 592   4096 Feb 11 01:06 .
>>   drwxr-xr-x 3 1106 592   4096 Feb 11 01:06 ..
>>   -rw-r--r-- 1 1106 592 616862 Feb 11 01:06 libhadoop.a
>>   -rwxr-xr-x 1 1106 592   1051 Feb 11 01:06 libhadoop.la
>>   lrwxrwxrwx 1 1106 592     18 Feb 27 18:12 libhadoop.so -> libhadoop.so.1.0.0
>>   lrwxrwxrwx 1 1106 592     18 Feb 27 18:12 libhadoop.so.1 -> libhadoop.so.1.0.0
>>   -rwxr-xr-x 1 1106 592 340361 Feb 11 01:06 libhadoop.so.1.0.0
>>   -rw-r--r-- 1 1106 592 184418 Feb 11 01:06 libhdfs.a
>>   -rwxr-xr-x 1 1106 592   1034 Feb 11 01:06 libhdfs.la
>>   lrwxrwxrwx 1 1106 592     16 Feb 27 18:12 libhdfs.so -> libhdfs.so.0.0.0
>>   lrwxrwxrwx 1 1106 592     16 Feb 27 18:12 libhdfs.so.0 -> libhdfs.so.0.0.0
>>   -rwxr-xr-x 1 1106 592 125455 Feb 11 01:06 libhdfs.so.0.0.0
>>   -rw-r--r-- 1 1106 592  37392 Feb 11 01:06 libsnappy.a
>>   lrwxrwxrwx 1 1106 592     18 Feb 27 18:12 libsnappy.so -> libsnappy.so.1.1.1
>>   lrwxrwxrwx 1 1106 592     18 Feb 27 18:12 libsnappy.so.1 -> libsnappy.so.1.1.1
>>   -rw-r--r-- 1 1106 592  26824 Feb 11 01:06 libsnappy.so.1.1.1
>>
>> Just for grins and giggles, I re-ran this as root. In addition to the exception mentioned above, I also got the following warning:
>>
>>   WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
>>
>> Any ideas?
>>
>> On Tue, 28 Feb 2012 20:02:38 -0500, Stack wrote:
>>
>>> On Tue, Feb 28, 2012 at 1:52 PM, Peter Naudus wrote:
>>>>
>>>> What else can I do to fix / diagnose this problem?
>>>>
>>>
>>> Does our little compression tool help?
>>> http://hbase.apache.org/book.html#compression.test
>>>
>>> St.Ack
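As a follow-up to the compression-test link above: the compression section of the HBase reference guide also describes hbase.regionserver.codecs, a property that makes a region server fail to start if any listed codec does not pass the smoke test. A minimal sketch of what that would look like in conf/hbase-site.xml (value shown for snappy only; list additional codecs as needed):

    <property>
      <name>hbase.regionserver.codecs</name>
      <value>snappy</value>
    </property>

Once the library-path issue is sorted out, this turns a misconfigured compression setup into an immediate startup failure instead of an IOException at write time.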