hadoop-hdfs-user mailing list archives

From German Florez-Larrahondo <german...@samsung.com>
Subject RE: Setting up Snappy compression in Hadoop
Date Thu, 02 Jan 2014 16:06:54 GMT

You may also need to check whether the native library you are using includes
Snappy or not. 


For example, when you compile from source and libsnappy.so is not found,
Snappy support is simply not included in the native Hadoop library (to force
the build to fail when libsnappy is not found, the require.snappy flag is
available).
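For reference, on recent source trees that flag is passed to the Maven build; a sketch (profile and flag names as in the 2.x BUILDING.txt, so treat them as an assumption for other versions):

```shell
# Fails the native build if the libsnappy headers/libraries are not found,
# instead of silently building libhadoop.so without Snappy support
mvn package -Pdist,native -DskipTests -Drequire.snappy
```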


A quick test could be this:


htf@german:~/hadoop/lib/native$ nm libhadoop.so | grep -i snappy
0000000000005e10 T
0000000000005db0 T
0000000000006200 T
0000000000006450 T
0000000000006860 T
000000000000d030 T
0000000000215958 b SnappyCompressor_clazz
0000000000215970 b SnappyCompressor_compressedDirectBuf
0000000000215978 b SnappyCompressor_directBufferSize
0000000000215960 b SnappyCompressor_uncompressedDirectBuf
0000000000215968 b SnappyCompressor_uncompressedDirectBufLen
0000000000215988 b SnappyDecompressor_clazz
0000000000215990 b SnappyDecompressor_compressedDirectBuf
0000000000215998 b SnappyDecompressor_compressedDirectBufLen
00000000002159a8 b SnappyDecompressor_directBufferSize
00000000002159a0 b SnappyDecompressor_uncompressedDirectBuf
0000000000215980 b dlsym_snappy_compress
00000000002159b0 b dlsym_snappy_uncompress


If you don't see any snappy-related objects in the library, then it hasn't
been compiled with Snappy support. 
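A note for readers on newer releases: if your Hadoop ships the checknative command (present in later 2.x releases, not in 1.x), it summarizes the same information; the output shown in the comment is indicative only:

```shell
# Hedged sketch: command availability and output format vary by Hadoop version
hadoop checknative -a
# Look for a line such as:  snappy:  true /usr/lib64/libsnappy.so.1
```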


Note that this info is based on recent Hadoop releases (e.g. 2.2.0), but
something similar should apply to your release.





From: bharath vissapragada [mailto:bharathvissapragada1990@gmail.com] 
Sent: Thursday, January 02, 2014 5:56 AM
To: User
Subject: Re: Setting up Snappy compression in Hadoop


Your natives should be on LD_LIBRARY_PATH or java.library.path for Hadoop to
pick them up. You can try adding export HADOOP_OPTS="$HADOOP_OPTS
-Djava.library.path=<path to your native libs>" to hadoop-env.sh on the TTs
and clients/gateways, restart the TTs, and give it another try. The reason
it's working for HBase is that you are manually pointing HBASE_LIBRARY_PATH
at the natives. My guess is they are in the wrong location.
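To see what the JVM itself will search, here is a minimal, Hadoop-free sketch; the library name "hadoop" maps to libhadoop.so on Linux, and the class name is made up for illustration:

```java
// Minimal sketch of the lookup Hadoop's native loader performs:
// System.loadLibrary scans each directory on java.library.path in order.
public class NativeLookup {
    public static void main(String[] args) {
        System.out.println("java.library.path = "
                + System.getProperty("java.library.path"));
        try {
            // Resolves to libhadoop.so on Linux, hadoop.dll on Windows
            System.loadLibrary("hadoop");
            System.out.println("libhadoop found");
        } catch (UnsatisfiedLinkError e) {
            // Hadoop hits this same error internally before it ever gets to
            // reporting "native snappy library not available"
            System.out.println("libhadoop not on java.library.path");
        }
    }
}
```

Running this with the same -Djava.library.path value you put in hadoop-env.sh is a quick way to confirm the path actually reaches the natives.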


On Thu, Jan 2, 2014 at 5:07 PM, Amit Sela <amits@infolinks.com> wrote:

I did everything mentioned in the link Ted mentioned, and the test actually
works, but using Snappy for MapReduce map output compression still fails
with "native snappy library not available".


On Wed, Jan 1, 2014 at 6:37 PM, bharath vissapragada
<bharathvissapragada1990@gmail.com> wrote:

Did you build it for your platform? You can run "ldd" on the .so file to
check that the dependent libs are present. Also make sure you placed it in
the right directory for your platform (Linux-amd64-64 or Linux-i386-32).
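A concrete sketch of that check (the path is an assumption for a 64-bit Linux 1.x install; adjust it to your layout):

```shell
# Any dependency printed as "not found" must be installed or put on the
# dynamic loader's search path
ldd $HADOOP_HOME/lib/native/Linux-amd64-64/libhadoop.so
# "libz.so.1 => /lib/x86_64-linux-gnu/libz.so.1" is fine;
# "libfoo.so => not found" is the failure case
```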


On Wed, Jan 1, 2014 at 10:02 PM, Ted Yu <yuzhihong@gmail.com> wrote:

Please take a look at http://hbase.apache.org/book.html#snappy.compression




On Wed, Jan 1, 2014 at 8:05 AM, Amit Sela <amits@infolinks.com> wrote:

Hi all, 


I'm running on Hadoop 1.0.4 and I'd like to use Snappy for map output
compression.

I'm adding the configuration:


configuration.setBoolean("mapred.compress.map.output", true);
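In Hadoop 1.x the codec is usually configured alongside that flag; a hedged sketch of the extra property (not verified against 1.0.4, and it assumes SnappyCodec and the natives are available to the job):

```java
// Sketch only: property names as used in Hadoop 1.x (mapred.*)
configuration.setBoolean("mapred.compress.map.output", true);
configuration.set("mapred.map.output.compression.codec",
    "org.apache.hadoop.io.compress.SnappyCodec");
```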



And I've added libsnappy.so.1 to $HADOOP_HOME/lib/native/Linux-amd64-64/


Still, all map tasks fail with "native snappy library not available".


Could anyone elaborate on how to install Snappy for Hadoop?








