hadoop-mapreduce-user mailing list archives

From Harsh J <ha...@cloudera.com>
Subject Re: snappy codec
Date Mon, 11 Jun 2012 14:06:36 GMT
Hi Marek,

Moving this to cdh-user@cloudera.org as it's CDH-specific. I've bcc'd
mapreduce-user@ and cc'd you in case you aren't a subscriber.

A few questions though:
- What OS and arch are you running? Check via "lsb_release -a"
and "uname -a"
-- Is your JVM the same arch? Check via
"/usr/java/default/bin/java -version"
- Why do you provide
LD_LIBRARY_PATH=/usr/lib/hadoop-0.20/lib/native/Linux-amd64-64 as
mapred.child.env? It isn't usually required if you have
hadoop-0.20-native installed and the TaskTrackers were restarted
after installing it
- Do other codecs (such as Gzip and Deflate, the default) work fine?
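The checks above could be scripted roughly as follows (a sketch: the native-library path is the CDH3 default for 64-bit Linux and may differ on your install):

```shell
#!/bin/sh
# Sketch of the diagnostic checks; adjust NATIVE_DIR for your layout.
uname -m                              # machine arch, e.g. x86_64

# JVM arch/version, if a java binary is on the PATH
command -v java >/dev/null 2>&1 && java -version 2>&1 | head -n1

NATIVE_DIR=/usr/lib/hadoop-0.20/lib/native/Linux-amd64-64
if [ -f "$NATIVE_DIR/libsnappy.so" ]; then
  # 'file' should report ELF 64-bit on an amd64 box; a 32-bit .so
  # under a 64-bit JVM (or vice versa) will fail to load.
  file "$NATIVE_DIR/libsnappy.so"
else
  echo "libsnappy.so not found in $NATIVE_DIR"
fi
```

A mismatch between the JVM arch and the .so arch is a common cause of "native library not available" errors.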

On Mon, Jun 11, 2012 at 7:24 PM, Marek Miglinski <mmiglinski@seven.com> wrote:
> Hi,
> I have Cloudera's CDH3u3 installed on my cluster and mapred.child.env set to "LD_LIBRARY_PATH=/usr/lib/hadoop-0.20/lib/native/Linux-amd64-64"
> (with libsnappy.so in that folder) in mapred-site.xml. Cloudera says that Snappy is included
> in their hadoop-0.20-native package, and it is installed on each of the nodes.
> But when I run a mapreduce task with "mapred.map.output.compression.codec" set to "org.apache.hadoop.io.compress.SnappyCodec"
> I get an exception:
> java.lang.RuntimeException: native snappy library not available
> Any idea why?
> Thanks,
> Marek M.
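For reference, the two settings Marek describes would look roughly like this in mapred-site.xml (a sketch using the CDH3 default native path; note that the old-API property mapred.compress.map.output must also be true for the map-output codec to take effect):

```xml
<!-- mapred-site.xml: sketch of the settings described above -->
<property>
  <name>mapred.child.env</name>
  <value>LD_LIBRARY_PATH=/usr/lib/hadoop-0.20/lib/native/Linux-amd64-64</value>
</property>
<property>
  <name>mapred.compress.map.output</name>
  <value>true</value>
</property>
<property>
  <name>mapred.map.output.compression.codec</name>
  <value>org.apache.hadoop.io.compress.SnappyCodec</value>
</property>
```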

Harsh J
