Subject: Re: How to install Snappy?
From: Jean-Marc Spaggiari
To: user@hbase.apache.org
Date: Mon, 3 Dec 2012 10:50:45 -0500

Ok. I retried on a brand new 0.94.3 installation, and the only 2 things
required are the libsnappy.so and the libhadoop.so files... And it's
taking 5 minutes to install ;)

I will now deploy that all over the cluster and give snappy+0.94.3RC a
try...

2012/12/3, Kevin O'dell:
> Never say die!
>
> On Mon, Dec 3, 2012 at 10:15 AM, Jean-Marc Spaggiari <
> jean-marc@spaggiari.org> wrote:
>
>> Ok, I got it!!!!
>>
>> I had to copy the hadoop native libs into the hbase native libs
>> directory!
>>
>> Now I get a SUCCESS when I'm doing the CompressionTest...
>>
>> I'm not 100% sure that it's the only thing which was missing, because I
>> have done so many modifications in the last 3 days...
>>
>> So I will start from a blank 0.94.3 jar and re-do all the steps to
>> make sure it's just the native libs which need to be copied.
>>
>> I was close to surrender ;)
>>
>> JM
>>
>> 2012/12/3, Jean-Marc Spaggiari:
>> > Hi Kevin,
>> >
>> > Thanks for the clarification.
>> >
>> > No, it's not what I'm seeing.
>> >
>> > Here is what I'm getting:
>> >
>> > 12/12/03 09:40:42 WARN snappy.LoadSnappy: Snappy native library is
>> > available
>> > 12/12/03 09:40:42 WARN snappy.LoadSnappy: Snappy native library not
>> > loaded
>> > Exception in thread "main" java.lang.RuntimeException: native snappy
>> > library not available
>> >     at org.apache.hadoop.io.compress.SnappyCodec.getCompressorType(SnappyCodec.java:123)
>> >     at org.apache.hadoop.io.compress.CodecPool.getCompressor(CodecPool.java:100)
>> >     at org.apache.hadoop.io.compress.CodecPool.getCompressor(CodecPool.java:112)
>> >     at org.apache.hadoop.hbase.io.hfile.Compression$Algorithm.getCompressor(Compression.java:264)
>> >     at org.apache.hadoop.hbase.io.hfile.HFileBlock$Writer.<init>(HFileBlock.java:739)
>> >     at org.apache.hadoop.hbase.io.hfile.HFileWriterV2.finishInit(HFileWriterV2.java:127)
>> >     at org.apache.hadoop.hbase.io.hfile.HFileWriterV2.<init>(HFileWriterV2.java:118)
>> >     at org.apache.hadoop.hbase.io.hfile.HFileWriterV2$WriterFactoryV2.createWriter(HFileWriterV2.java:101)
>> >     at org.apache.hadoop.hbase.io.hfile.HFile$WriterFactory.create(HFile.java:394)
>> >     at org.apache.hadoop.hbase.util.CompressionTest.doSmokeTest(CompressionTest.java:108)
>> >     at org.apache.hadoop.hbase.util.CompressionTest.main(CompressionTest.java:138)
>> >
>> > The most disturbing part is this line:
>> > 12/12/03 09:40:42 WARN snappy.LoadSnappy: Snappy native library is
>> > available
>> >
>> > Followed by this one:
>> > Exception in thread "main" java.lang.RuntimeException: native snappy
>> > library not available
>> >
>> > Is it available? Or is it not available?
>> >
>> > I looked in the code and I have no idea why the 2nd one is raised.
>> >
>> > The code I'm looking at is on the hadoop-snappy site, but the one I
>> > have on my server is in the hadoop-core-1.0.3.jar file. So maybe
>> > that's the issue and they are different?
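They do differ, and in exactly the spot that matters here. If memory serves, the hadoop-core 1.0.3 variant of LoadSnappy computes LOADED as AVAILABLE combined with NativeCodeLoader.isNativeCodeLoaded(), so snappy can load fine ("is available") while LOADED stays false whenever libhadoop.so itself was not found (note the "Unable to load native-hadoop library for your platform" warning elsewhere in this thread). A minimal standalone model of that two-flag logic, as a sketch of its shape rather than the real Hadoop source:

```java
// Standalone model of the hadoop-core-1.0.3 LoadSnappy decision (from
// memory, hedged): LOADED requires both libsnappy AND libhadoop.
public class LoadSnappyModel {
    // available: System.loadLibrary("snappy") succeeded
    //            (logs "Snappy native library is available").
    // hadoopNativeLoaded: NativeCodeLoader found libhadoop.so.
    static boolean loaded(boolean available, boolean hadoopNativeLoaded) {
        return available && hadoopNativeLoaded;
    }

    public static void main(String[] args) {
        // The situation in the logs above: snappy loads, libhadoop does
        // not, so LOADED stays false and "not loaded" is printed.
        System.out.println(loaded(true, false));
    }
}
```

That would also explain why copying libhadoop.so alongside libsnappy.so into the HBase native lib directory resolves it, while the hadoop-snappy project's version of the class (LOADED = AVAILABLE) would not show this behavior.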
>> >
>> > I built the hadoop-snappy-0.0.1-SNAPSHOT.jar file too. I placed it in
>> > the lib folder and made sure it was taken first, but still not
>> > working.
>> >
>> > So far I think I will stay with GZip until Snappy is integrated in
>> > the HBase files...
>> >
>> > JM
>> >
>> > 2012/12/3, Kevin O'dell:
>> >> Hey JM,
>> >>
>> >> Sorry for the quick message earlier. I tracked down the JIRA I was
>> >> referring to: https://issues.apache.org/jira/browse/HBASE-7080
>> >>
>> >> Does this look like what you are seeing in CompressionTest?
>> >>
>> >> On Mon, Dec 3, 2012 at 9:09 AM, Kevin O'dell wrote:
>> >>
>> >>> There is a compression test JIRA right now. What are you seeing?
>> >>>
>> >>>
>> >>> On Mon, Dec 3, 2012 at 8:47 AM, Jean-Marc Spaggiari <
>> >>> jean-marc@spaggiari.org> wrote:
>> >>>
>> >>>> Ok....
>> >>>>
>> >>>> This: http://code.google.com/p/hadoop-snappy/issues/detail?id=2
>> >>>> helped me, and my test program is now working. I'm able to load
>> >>>> both libraries. Fine.
>> >>>>
>> >>>> But the CompressionTest is still not working.
>> >>>>
>> >>>> What is very strange is that:
>> >>>> 12/12/03 08:44:24 WARN snappy.LoadSnappy: Snappy native library is
>> >>>> available
>> >>>> 12/12/03 08:44:24 WARN snappy.LoadSnappy: Snappy native library not
>> >>>> loaded
>> >>>>
>> >>>> It's available, but not loaded.
>> >>>>
>> >>>> But from the code:
>> >>>>   static {
>> >>>>     try {
>> >>>>       System.loadLibrary("snappy");
>> >>>>       System.loadLibrary("hadoopsnappy");
>> >>>>       LOG.warn("Snappy native library is available");
>> >>>>       AVAILABLE = true;
>> >>>>     } catch (UnsatisfiedLinkError ex) {
>> >>>>       //NOP
>> >>>>     }
>> >>>>     LOADED = AVAILABLE;
>> >>>>     if (LOADED) {
>> >>>>       LOG.info("Snappy native library loaded");
>> >>>>     } else {
>> >>>>       LOG.warn("Snappy native library not loaded");
>> >>>>     }
>> >>>>   }
>> >>>>
>> >>>> If "Snappy native library is available" is displayed, that means
>> >>>> AVAILABLE = true...
>> >>>> And if AVAILABLE = true, then LOADED is set to true, and "Snappy
>> >>>> native library loaded" must be displayed... But it's not... How is
>> >>>> this possible?
>> >>>>
>> >>>> I had not expected Snappy installation to be such a challenge...
>> >>>>
>> >>>> I will continue to dig and summarize the steps when I am done (if
>> >>>> I'm able to finish...)
>> >>>>
>> >>>> JM
>> >>>>
>> >>>> 2012/12/3, Jean-Marc Spaggiari:
>> >>>> > Thanks all for your replies.
>> >>>> >
>> >>>> > So, to reply to all in one.
>> >>>> >
>> >>>> > I'm not using CDH3. I'm using Hadoop 1.0.3 and HBase 0.94.2
>> >>>> > directly from the JARs.
>> >>>> >
>> >>>> > Here are all the places where I have put the lib:
>> >>>> > /home/hadoop/hadoop-1.0.3/lib/native/Linux-amd64-64/libsnappy.so
>> >>>> > /home/hadoop/hadoop-1.0.3/lib/native/Linux-amd64-64/libsnappy.so.1
>> >>>> > /home/hadoop/hadoop-1.0.3/lib/native/Linux-amd64-64/libsnappy.so.1.1.3
>> >>>> > /home/hadoop/hadoop-1.0.3/lib/native/Linux-i386-32/libsnappy.so
>> >>>> > /home/hadoop/hadoop-1.0.3/lib/native/Linux-i386-32/libsnappy.so.1
>> >>>> > /home/hadoop/hadoop-1.0.3/lib/native/Linux-i386-32/libsnappy.so.1.1.3
>> >>>> > /home/hbase/hbase-0.94.2/lib/native/libsnappy.so
>> >>>> > /home/hbase/hbase-0.94.2/lib/native/libsnappy.so.1
>> >>>> > /home/hbase/hbase-0.94.2/lib/native/libsnappy.so.1.1.3
>> >>>> > /home/hbase/hbase-0.94.2/lib/native/Linux-amd64-64/libsnappy.so
>> >>>> > /home/hbase/hbase-0.94.2/lib/native/Linux-amd64-64/libsnappy.so.1
>> >>>> > /home/hbase/hbase-0.94.2/lib/native/Linux-amd64-64/libsnappy.so.1.1.3
>> >>>> > /home/hbase/hbase-0.94.2/lib/native/Linux-i386-32/libsnappy.so
>> >>>> > /home/hbase/hbase-0.94.2/lib/native/Linux-i386-32/libsnappy.so.1
>> >>>> > /home/hbase/hbase-0.94.2/lib/native/Linux-i386-32/libsnappy.so.1.1.3
>> >>>> > /lib/x86_64-linux-gnu/libsnappy.so
>> >>>> > /usr/lib/libsnappy.so
>> >>>> > /usr/lib/libsnappy.so.1
>> >>>> > /usr/lib/libsnappy.so.1.1.3
>> >>>> > /usr/local/lib/libsnappy.so
>> >>>> > /usr/local/lib/libsnappy.so.1
>> >>>> > /usr/local/lib/libsnappy.so.1.1.3
>> >>>> >
>> >>>> > I tried to add this in my hbase-env.sh:
>> >>>> > export HBASE_LIBRARY_PATH=/home/hbase/hbase-0.94.2/lib/native/Linux-amd64-64
>> >>>> >
>> >>>> > Before, I was trying to do the export on the command line
>> >>>> > directly, since it seems the hbase script takes that into
>> >>>> > consideration too.
>> >>>> >
>> >>>> > I have not yet put the hbase.regionserver.codecs line, since I
>> >>>> > still need to use my cluster until I get snappy working. In the
>> >>>> > hbase/lib directory I have snappy-java-1.0.3.2.jar.
>> >>>> >
>> >>>> > Should snappy be installed within hbase? Or should it be in
>> >>>> > hadoop? I'm not sure anymore.
>> >>>> >
>> >>>> > But it's still not working. So I tried the small code below:
>> >>>> >
>> >>>> > import java.util.StringTokenizer;
>> >>>> >
>> >>>> > public class Test
>> >>>> > {
>> >>>> >   static {
>> >>>> >     try {
>> >>>> >       System.loadLibrary("snappy");
>> >>>> >       System.loadLibrary("hadoopsnappy");
>> >>>> >       System.out.println("Snappy native library is available");
>> >>>> >     } catch (UnsatisfiedLinkError ex) {
>> >>>> >       ex.printStackTrace();
>> >>>> >     }
>> >>>> >   }
>> >>>> >
>> >>>> >   public static void main(String[] args)
>> >>>> >   {
>> >>>> >     System.out.println("Coucou");
>> >>>> >     String property = System.getProperty("java.library.path");
>> >>>> >     // Note: on Linux the path separator is ":", not ";", so with
>> >>>> >     // ";" this prints the whole path as a single token.
>> >>>> >     StringTokenizer parser = new StringTokenizer(property, ";");
>> >>>> >     while (parser.hasMoreTokens()) {
>> >>>> >       System.err.println(parser.nextToken());
>> >>>> >     }
>> >>>> >   }
>> >>>> > }
>> >>>> >
>> >>>> > This code is from org.apache.hadoop.io.compress.snappy.LoadSnappy.
>> >>>> > The error I'm getting is java.lang.UnsatisfiedLinkError: no
>> >>>> > hadoopsnappy in java.library.path.
>> >>>> >
>> >>>> > So the issue is not the snappy lib. It's there and working fine.
>> >>>> > The issue is the hadoopsnappy lib, which I don't have...
>> >>>> >
>> >>>> > I found it there: http://code.google.com/p/hadoop-snappy/
>> >>>> >
>> >>>> > So I have extracted it with svn checkout
>> >>>> > http://hadoop-snappy.googlecode.com/svn/trunk/ hadoop-snappy-read-only,
>> >>>> > and tried to build it with mvn package, but it's failing with
>> >>>> > something saying "cannot find -ljvm".
>> >>>> >
>> >>>> > So it seems my challenge will be to build hadoop-snappy, and not
>> >>>> > to install snappy, which is already there and working...
>> >>>> >
>> >>>> > JM
>> >>>> >
>> >>>> > 2012/12/3, surfer:
>> >>>> >> hope it helps. this is what I do on apache hadoop 1.0.x and hbase 0.92.y:
>> >>>> >> in hbase-site.xml add:
>> >>>> >>
>> >>>> >> <property>
>> >>>> >>   <name>hbase.regionserver.codecs</name>
>> >>>> >>   <value>snappy</value>
>> >>>> >> </property>
>> >>>> >>
>> >>>> >> copy that file into the hadoop conf directory.
>> >>>> >>
>> >>>> >> in hbase-env.sh:
>> >>>> >> export HBASE_LIBRARY_PATH=/pathtoyourhadoop/lib/native/Linux-amd64-64
>> >>>> >>
>> >>>> >> (In hbase-env.sh I also set HBASE_HOME, HBASE_CONF_DIR,
>> >>>> >> HADOOP_HOME, and HADOOP_CONF_DIR, but I don't know if they
>> >>>> >> contribute to making snappy work...)
>> >>>> >>
>> >>>> >> in /pathtoyourhadoop/lib/native/Linux-amd64-64 I have:
>> >>>> >> libsnappy.a
>> >>>> >> libsnappy.so
>> >>>> >> libsnappy.so.1
>> >>>> >> libsnappy.so.1.1.2
>> >>>> >>
>> >>>> >> good luck
>> >>>> >> giovanni
>> >>>> >>
>> >>>> >> On 12/02/2012 02:25 PM, Jean-Marc Spaggiari wrote:
>> >>>> >>> So. I spent a few hours on that yesterday with no luck.
>> >>>> >>>
>> >>>> >>> Here is what I did:
>> >>>> >>> - Installed the google tar: untarred, configured, ran make and
>> >>>> >>> make install.
>> >>>> >>> - Copied the .so files all over my fs: in the os lib dir,
>> >>>> >>> HBase/lib/native and subdirs, Hadoop/lib/native and subdirs.
>> >>>> >>> - Installed all the debian packages with snappy in the name:
>> >>>> >>> python-snappy, libsnappy-dev, libsnappy1, libsnappy-java
>> >>>> >>>
>> >>>> >>> But still exactly the same issue as above. And I don't have any
>> >>>> >>> clue where to dig. There is nothing on the internet about that.
>> >>>> >>>
>> >>>> >>> Anyone faced that already while installing Snappy?
>> >>>> >>>
>> >>>> >>> JM
>> >>>> >>>
>> >>>> >>> 2012/12/1, Jean-Marc Spaggiari:
>> >>>> >>>> Sorry, I forgot to paste a few maybe-useful lines. I have the
>> >>>> >>>> lib in /usr/local/lib copied properly, and I have
>> >>>> >>>> HBASE_LIBRARY_PATH set correctly. Do I need to restart HBase
>> >>>> >>>> to run this test?
>> >>>> >>>>
>> >>>> >>>> hbase@node3:~/hbase-0.94.2$ export HBASE_LIBRARY_PATH=/usr/local/lib/
>> >>>> >>>> hbase@node3:~/hbase-0.94.2$ bin/hbase
>> >>>> >>>> org.apache.hadoop.hbase.util.CompressionTest /tmp/test.txt snappy
>> >>>> >>>> 12/12/01 18:55:29 INFO util.ChecksumType:
>> >>>> >>>> org.apache.hadoop.util.PureJavaCrc32 not available.
>> >>>> >>>> 12/12/01 18:55:29 INFO util.ChecksumType: Checksum can use
>> >>>> >>>> java.util.zip.CRC32
>> >>>> >>>> 12/12/01 18:55:29 INFO util.ChecksumType:
>> >>>> >>>> org.apache.hadoop.util.PureJavaCrc32C not available.
>> >>>> >>>> 12/12/01 18:55:29 DEBUG util.FSUtils: Creating file:/tmp/test.txt
>> >>>> >>>> with permission:rwxrwxrwx
>> >>>> >>>> 12/12/01 18:55:29 WARN util.NativeCodeLoader: Unable to load
>> >>>> >>>> native-hadoop library for your platform... using builtin-java
>> >>>> >>>> classes where applicable
>> >>>> >>>> 12/12/01 18:55:29 WARN metrics.SchemaConfigured: Could not
>> >>>> >>>> determine table and column family of the HFile path
>> >>>> >>>> /tmp/test.txt. Expecting at least 5 path components.
>> >>>> >>>> 12/12/01 18:55:29 WARN snappy.LoadSnappy: Snappy native
>> >>>> >>>> library is available
>> >>>> >>>> 12/12/01 18:55:29 WARN snappy.LoadSnappy: Snappy native
>> >>>> >>>> library not loaded
>> >>>> >>>> Exception in thread "main" java.lang.RuntimeException: native
>> >>>> >>>> snappy library not available
>> >>>> >>>>     at org.apache.hadoop.io.compress.SnappyCodec.getCompressorType(SnappyCodec.java:123)
>> >>>> >>>>     at org.apache.hadoop.io.compress.CodecPool.getCompressor(CodecPool.java:100)
>> >>>> >>>>     at org.apache.hadoop.io.compress.CodecPool.getCompressor(CodecPool.java:112)
>> >>>> >>>>     at org.apache.hadoop.hbase.io.hfile.Compression$Algorithm.getCompressor(Compression.java:264)
>> >>>> >>>>     at org.apache.hadoop.hbase.io.hfile.HFileBlock$Writer.<init>(HFileBlock.java:739)
>> >>>> >>>>     at org.apache.hadoop.hbase.io.hfile.HFileWriterV2.finishInit(HFileWriterV2.java:127)
>> >>>> >>>>     at org.apache.hadoop.hbase.io.hfile.HFileWriterV2.<init>(HFileWriterV2.java:118)
>> >>>> >>>>     at org.apache.hadoop.hbase.io.hfile.HFileWriterV2$WriterFactoryV2.createWriter(HFileWriterV2.java:101)
>> >>>> >>>>     at org.apache.hadoop.hbase.io.hfile.HFile$WriterFactory.create(HFile.java:394)
>> >>>> >>>>     at org.apache.hadoop.hbase.util.CompressionTest.doSmokeTest(CompressionTest.java:108)
>> >>>> >>>>     at org.apache.hadoop.hbase.util.CompressionTest.main(CompressionTest.java:138)
>> >>>> >>>> hbase@node3:~/hbase-0.94.2$ ll /usr/local/lib/
>> >>>> >>>> total 572
>> >>>> >>>> -rw-r--r-- 1 root staff 391614 déc  1 18:33 libsnappy.a
>> >>>> >>>> -rwxr-xr-x 1 root staff    957 déc  1 18:33 libsnappy.la
>> >>>> >>>> lrwxrwxrwx 1 root staff     18 déc  1 18:33 libsnappy.so
>> >>>> >>>> -> libsnappy.so.1.1.3
>> >>>> >>>> lrwxrwxrwx 1 root staff     18 déc  1 18:33 libsnappy.so.1 ->
>> >>>> >>>> libsnappy.so.1.1.3
>> >>>> >>>> -rwxr-xr-x 1 root staff 178210 déc  1 18:33 libsnappy.so.1.1.3
>> >>>> >>>> drwxrwsr-x 4 root staff   4096 jui 13 10:06 python2.6
>> >>>> >>>> drwxrwsr-x 4 root staff   4096 jui 13 10:06 python2.7
>> >>>> >>>> hbase@node3:~/hbase-0.94.2$
>> >>>> >>>>
>> >>>> >>>> 2012/12/1, Jean-Marc Spaggiari:
>> >>>> >>>>> Hi,
>> >>>> >>>>>
>> >>>> >>>>> I'm currently using GZip and want to move to Snappy.
>> >>>> >>>>>
>> >>>> >>>>> I have downloaded the tar file, extracted, built, ran make
>> >>>> >>>>> install and make check, and everything is working fine.
>> >>>> >>>>>
>> >>>> >>>>> However, I'm not able to get this working:
>> >>>> >>>>> bin/hbase org.apache.hadoop.hbase.util.CompressionTest
>> >>>> >>>>> /tmp/test.txt snappy
>> >>>> >>>>> 12/12/01 18:46:21 WARN snappy.LoadSnappy: Snappy native
>> >>>> >>>>> library not loaded
>> >>>> >>>>> Exception in thread "main" java.lang.RuntimeException: native
>> >>>> >>>>> snappy library not available
>> >>>> >>>>>     at org.apache.hadoop.io.compress.SnappyCodec.getCompressorType(SnappyCodec.java:123)
>> >>>> >>>>>
>> >>>> >>>>> It sounds like HBase is not able to find the native library.
>> >>>> >>>>> How can I tell HBase where the library is?
>> >>>> >>>>>
>> >>>> >>>>> Thanks,
>> >>>> >>>>>
>> >>>> >>>>> JM
>> >>>
>> >>> --
>> >>> Kevin O'Dell
>> >>> Customer Operations Engineer, Cloudera
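The resolution from the top of the thread can be condensed into a short script. The HADOOP_HOME/HBASE_HOME paths below are examples taken from JM's layout, not canonical locations; adjust them (and the platform directory) for your own installation:

```shell
#!/bin/sh
# Sketch of the steps that ended up working: put both libhadoop.so and
# libsnappy.so where HBase can find them, then smoke-test.
HADOOP_HOME=${HADOOP_HOME:-/home/hadoop/hadoop-1.0.3}
HBASE_HOME=${HBASE_HOME:-/home/hbase/hbase-0.94.2}
PLATFORM=${PLATFORM:-Linux-amd64-64}

SRC="$HADOOP_HOME/lib/native/$PLATFORM"
DST="$HBASE_HOME/lib/native/$PLATFORM"

# 1. Copy Hadoop's native libs (libhadoop.so AND libsnappy.so) into
#    HBase's native lib directory.
if [ -d "$SRC" ]; then
  mkdir -p "$DST"
  cp "$SRC"/libhadoop.* "$SRC"/libsnappy.* "$DST"/ 2>/dev/null
  echo "copied native libs to $DST"
else
  echo "no native libs found under $SRC" >&2
fi

# 2. Point HBase at them (usually exported from hbase-env.sh).
export HBASE_LIBRARY_PATH="$DST"

# 3. Smoke-test on a real installation:
#    bin/hbase org.apache.hadoop.hbase.util.CompressionTest /tmp/test.txt snappy
```

Expect `SUCCESS` from the CompressionTest once both libraries are in place; only then is it safe to add the `hbase.regionserver.codecs` guard property, since a region server configured with it will refuse to start if snappy is broken.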