hadoop-mapreduce-user mailing list archives

From GUOJUN Zhu <guojun_...@freddiemac.com>
Subject Re: snappy codec
Date Mon, 11 Jun 2012 14:12:25 GMT
I believe that mapred.child.env only affects the spawned child
process in which the map task runs.  You probably need to put this path into
the environment where the map task daemon runs, and possibly where the job
controller runs as well.
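One common way to make the native library visible to the daemons themselves is to export the path in conf/hadoop-env.sh on each node and restart them. This is a sketch assuming CDH3's standard hadoop-0.20 layout; adjust paths to your installation:

```shell
# conf/hadoop-env.sh on each node (sketch; path assumes CDH3's
# hadoop-0.20 layout -- adjust to your installation)
export JAVA_LIBRARY_PATH=/usr/lib/hadoop-0.20/lib/native/Linux-amd64-64
# or, equivalently, extend the loader path directly:
export LD_LIBRARY_PATH=/usr/lib/hadoop-0.20/lib/native/Linux-amd64-64:$LD_LIBRARY_PATH
```

The daemons only read this file at startup, so restart the TaskTracker (and JobTracker, if needed) afterwards so they pick up the new environment.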

Zhu, Guojun
Modeling Sr Graduate
571-3824370
guojun_zhu@freddiemac.com
Financial Engineering
Freddie Mac



Marek Miglinski <mmiglinski@seven.com> wrote on 06/11/2012 09:54 AM:
To: "mapreduce-user@hadoop.apache.org" <mapreduce-user@hadoop.apache.org>
Subject: snappy codec

Hi,

I have Cloudera's CDH3u3 installed on my cluster, with mapred.child.env set
to "LD_LIBRARY_PATH=/usr/lib/hadoop-0.20/lib/native/Linux-amd64-64" in
mapred-site.xml (libsnappy.so is present in that folder). Cloudera says that
Snappy is included in their hadoop-0.20-native package, and that package is
installed on each of the nodes.
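For reference, the mapred-site.xml entries described above would look roughly like this. This is a sketch of the configuration as stated, using the Hadoop 0.20 property names; the mapred.compress.map.output property is an assumption on my part, since map output compression also has to be switched on for the codec to be used:

```xml
<!-- mapred-site.xml (sketch of the settings described above) -->
<property>
  <name>mapred.child.env</name>
  <value>LD_LIBRARY_PATH=/usr/lib/hadoop-0.20/lib/native/Linux-amd64-64</value>
</property>
<property>
  <name>mapred.map.output.compression.codec</name>
  <value>org.apache.hadoop.io.compress.SnappyCodec</value>
</property>
<property>
  <name>mapred.compress.map.output</name>
  <value>true</value>
</property>
```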

But when I run a mapreduce task with "mapred.map.output.compression.codec" 
set to "org.apache.hadoop.io.compress.SnappyCodec" I get an exception:

java.lang.RuntimeException: native snappy library not available


Any idea why?



Thanks,
Marek M.

