hbase-user mailing list archives

From Rural Hunter <ruralhun...@gmail.com>
Subject Re: Snappy compression question
Date Sat, 04 Jan 2014 05:01:14 GMT
The document is far from complete. It doesn't mention that the default Hadoop binary package is compiled without Snappy support, so you need to compile it with the Snappy option yourself. In fact, the binary package doesn't work with any native libs on a 64-bit OS, because the bundled libhadoop.so is built for 32-bit systems only. It also doesn't mention that you actually need both snappy and hadoop-snappy.
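
For reference, a rough sketch of the rebuild steps (paths, the snappy package name, and the Maven flags are from my environment; adjust for yours — and note -Drequire.snappy assumes your Hadoop source tree's pom supports it):

```shell
# Rebuild Hadoop from source with native snappy support.
# Requires libsnappy and its headers (e.g. the snappy-devel package) installed first.
# The "native" profile builds a 64-bit libhadoop.so; -Drequire.snappy makes the
# build fail early if the snappy headers are missing instead of silently skipping them.
cd hadoop-2.2.0-src
mvn clean package -Pdist,native -DskipTests -Dtar -Drequire.snappy

# Replace the bundled 32-bit native libs with the freshly built 64-bit ones
# (HADOOP_HOME is assumed to point at your installed hadoop-2.2.0).
cp hadoop-dist/target/hadoop-2.2.0/lib/native/* "$HADOOP_HOME/lib/native/"

# Verify: libhadoop.so should now be a 64-bit ELF object.
file "$HADOOP_HOME/lib/native/libhadoop.so"
```

After that, re-running the CompressionTest from the quoted message below should no longer report "built without snappy support".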

On 2014/1/3 19:20, 张玉雪 wrote:
> Hi:
>
>           I am using hadoop 2.2.0 and hbase 0.96.1.1 with snappy compression.
>
>           I followed the topic http://hbase.apache.org/book/snappy.compression.html, but I get an error. Can someone help me?
>
>     
>
> [hadoop@master bin]$ hbase org.apache.hadoop.hbase.util.CompressionTest file:///tmp/test222.txt snappy
>
> 2014-01-03 19:12:41,971 INFO  [main] Configuration.deprecation: hadoop.native.lib is deprecated. Instead, use io.native.lib.available
>
> SLF4J: Class path contains multiple SLF4J bindings.
>
> SLF4J: Found binding in [jar:file:/home/hadoop/hbase-0.96.1.1-hadoop2/lib/slf4j-log4j12-1.6.4.jar!/org/slf4j/impl/StaticLoggerBinder.class]
>
> SLF4J: Found binding in [jar:file:/home/hadoop/hadoop-2.2.0/share/hadoop/common/lib/slf4j-log4j12-1.7.5.jar!/org/slf4j/impl/StaticLoggerBinder.class]
>
> SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
>
> 2014-01-03 19:12:42,663 INFO  [main] util.ChecksumType: Checksum using org.apache.hadoop.util.PureJavaCrc32
>
> 2014-01-03 19:12:42,670 INFO  [main] util.ChecksumType: Checksum can use org.apache.hadoop.util.PureJavaCrc32C
>
> Exception in thread "main" java.lang.RuntimeException: native snappy library not available: this version of libhadoop was built without snappy support.
>
>          at org.apache.hadoop.io.compress.SnappyCodec.checkNativeCodeLoaded(SnappyCodec.java:63)
>
>          at org.apache.hadoop.io.compress.SnappyCodec.getCompressorType(SnappyCodec.java:131)
>
>          at org.apache.hadoop.io.compress.CodecPool.getCompressor(CodecPool.java:147)
>
>          at org.apache.hadoop.io.compress.CodecPool.getCompressor(CodecPool.java:162)
>
>          at org.apache.hadoop.hbase.io.compress.Compression$Algorithm.getCompressor(Compression.java:312)
>
>          at org.apache.hadoop.hbase.io.encoding.HFileBlockDefaultEncodingContext.<init>(HFileBlockDefaultEncodingContext.java:79)
>
>          at org.apache.hadoop.hbase.io.hfile.HFileBlock$Writer.<init>(HFileBlock.java:719)
>
>          at org.apache.hadoop.hbase.io.hfile.HFileWriterV2.finishInit(HFileWriterV2.java:131)
>
>          at org.apache.hadoop.hbase.io.hfile.HFileWriterV2.<init>(HFileWriterV2.java:122)
>
>          at org.apache.hadoop.hbase.io.hfile.HFileWriterV2$WriterFactoryV2.createWriter(HFileWriterV2.java:105)
>
>          at org.apache.hadoop.hbase.io.hfile.HFile$WriterFactory.create(HFile.java:426)
>
>          at org.apache.hadoop.hbase.util.CompressionTest.doSmokeTest(CompressionTest.java:115)
>
>          at org.apache.hadoop.hbase.util.CompressionTest.main(CompressionTest.java:145)
>
>   
>
>   
>
> I have installed snappy 1.2.0 and hadoop-snappy successfully… and I also checked my configuration in core-site.xml and hbase-env.sh.
>
> They are all right. How can I resolve this problem?

