From: Alex Luya <alexander.luya@gmail.com>
Organization: Athena Inc.
To: hdfs-user@hadoop.apache.org, common-user@hadoop.apache.org
Subject: Question about LZO (Caused by: java.lang.ClassNotFoundException: com.hadoop.compression.lzo.LzopCodec)
Date: Wed, 21 Jul 2010 08:56:35 +0800
Message-Id: <201007210856.35581.alexander.luya@gmail.com>

Hello,

I got the source code from http://github.com/kevinweil/hadoop-lzo and compiled it successfully. Then I:

1. copied hadoop-lzo-0.4.4.jar to $HADOOP_HOME/lib on each master and slave;
2. copied all files under ../Linux-amd64-64 to $HADOOP_HOME/lib/native/Linux-amd64-64 on each master and slave;
3. uploaded a file, test.lzo, to HDFS;
4. ran: hadoop jar $HADOOP_HOME/lib/hadoop-lzo-0.4.4.jar com.hadoop.compression.lzo.DistributedLzoIndexer test.lzo

to test it, and got these errors:

---------------------------------------------------------------------------
10/07/20 22:37:37 INFO lzo.GPLNativeCodeLoader: Loaded native gpl library
10/07/20 22:37:37 INFO lzo.LzoCodec:
Successfully loaded & initialized native-lzo library [hadoop-lzo rev 5c25e0073d3dae9ace4bd9eba72e4dc43650c646]
########## ^_^ (I think this says the native library got loaded successfully) ##########
10/07/20 22:37:37 INFO lzo.DistributedLzoIndexer: Adding LZO file target.lzo to indexing list (no index currently exists)
10/07/20 22:37:37 WARN mapred.JobClient: Use GenericOptionsParser for parsing the arguments. Applications should implement Tool for the same.
10/07/20 22:37:38 INFO input.FileInputFormat: Total input paths to process : 1
10/07/20 22:37:38 INFO mapred.JobClient: Running job: job_201007202234_0001
10/07/20 22:37:39 INFO mapred.JobClient:  map 0% reduce 0%
10/07/20 22:37:48 INFO mapred.JobClient: Task Id : attempt_201007202234_0001_m_000000_0, Status : FAILED
java.lang.IllegalArgumentException: Compression codec com.hadoop.compression.lzo.LzopCodec not found.
        at org.apache.hadoop.io.compress.CompressionCodecFactory.getCodecClasses(CompressionCodecFactory.java:96)
        at org.apache.hadoop.io.compress.CompressionCodecFactory.<init>(CompressionCodecFactory.java:134)
        at com.hadoop.mapreduce.LzoSplitRecordReader.initialize(LzoSplitRecordReader.java:48)
        at org.apache.hadoop.mapred.MapTask$NewTrackingRecordReader.initialize(MapTask.java:418)
        at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:620)
        at org.apache.hadoop.mapred.MapTask.run(MapTask.java:305)
        at org.apache.hadoop.mapred.Child.main(Child.java:170)
Caused by: java.lang.ClassNotFoundException: com.hadoop.compression.lzo.LzopCodec
        at java.net.URLClassLoader$1.run(URLClassLoader.java:202)
        at java.security.AccessController.doPrivileged(Native Method)
        at java.net.URLClassLoader.findClass(URLClassLoader.java:190)
        at java.lang.ClassLoader.loadClass(ClassLoader.java:307)
        at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:301)
        at java.lang.ClassLoader.loadClass(ClassLoader.java:248)
        at
java.lang.Class.forName0(Native Method)
        at java.lang.Class.forName(Class.java:247)
        at org.apache.hadoop.conf.Configuration.getClassByName(Configuration.java:762)
        at org.apache.hadoop.io.compress.CompressionCodecFactory.getCodecClasses(CompressionCodecFactory.java:89)
        ... 6 more
---------------------------------------------------------------------------

There is an installation instruction at this link: http://github.com/kevinweil/hadoop-lzo. It says further configuration is needed:

    Once the libs are built and installed, you may want to add them to the
    class paths and library paths. That is, in hadoop-env.sh, set
    (1) export HADOOP_CLASSPATH=/path/to/your/hadoop-lzo-lib.jar

Question: I have already copied hadoop-lzo-0.4.4.jar to $HADOOP_HOME/lib; should I still set this entry? Actually, after I added:

export HADOOP_CLASSPATH=$HADOOP_CLASSPATH:$HBASE_HOME/hbase-0.20.4.jar:$HBASE_HOME/config:$ZOOKEEPER_HOME/zookeeper-3.3.1.jar:$HADOOP_HOME/lib/hadoop-lzo-0.4.4.jar

and redid steps 1-4 above, I got the same problem as before. So how can I get Hadoop to load hadoop-lzo-0.4.4.jar?

    (2) export JAVA_LIBRARY_PATH=/path/to/hadoop-lzo-native-libs:/path/to/standard-hadoop-native-libs

    Note that there seems to be a bug in /path/to/hadoop/bin/hadoop; comment
    out the line
    (3) JAVA_LIBRARY_PATH=''

Question: since the native library got loaded successfully, aren't operations (2) and (3) needed?
---------------------------------------------------------------------------
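For concreteness, steps 1-4 above could be scripted roughly like this. This is a sketch only: "master", "slave1", "slave2" are hypothetical hostnames, and it assumes passwordless scp and the same $HADOOP_HOME layout on every node.

```shell
# Distribute the jar and native libraries to every node (steps 1-2).
# Hostnames are hypothetical; $HADOOP_HOME is assumed identical on all nodes.
for host in master slave1 slave2; do
  scp hadoop-lzo-0.4.4.jar "$host:$HADOOP_HOME/lib/"
  scp -r Linux-amd64-64/* "$host:$HADOOP_HOME/lib/native/Linux-amd64-64/"
done

# Upload the test file (step 3) and run the indexer on it (step 4).
hadoop fs -put test.lzo test.lzo
hadoop jar "$HADOOP_HOME/lib/hadoop-lzo-0.4.4.jar" \
  com.hadoop.compression.lzo.DistributedLzoIndexer test.lzo
```

These commands need a live cluster, so they are not meant to run outside one; adjust hostnames and paths to your setup.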
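Spelled out against the paths from steps 1-2, my understanding of items (1)-(3) as hadoop-env.sh entries would look roughly like this — a sketch under those assumptions, not taken verbatim from the README:

```shell
# hadoop-env.sh — sketch only; all paths assumed from steps 1-2 above

# (1) add the hadoop-lzo jar to the classpath explicitly,
#     in addition to having it in $HADOOP_HOME/lib
export HADOOP_CLASSPATH=$HADOOP_CLASSPATH:$HADOOP_HOME/lib/hadoop-lzo-0.4.4.jar

# (2) point JAVA_LIBRARY_PATH at the lzo native libs and the standard
#     Hadoop native libs
export JAVA_LIBRARY_PATH=$HADOOP_HOME/lib/native/Linux-amd64-64:$HADOOP_HOME/lib/native

# (3) per the README, also comment out any line in bin/hadoop that resets
#     JAVA_LIBRARY_PATH=''
```

This is a configuration fragment for hadoop-env.sh, not a standalone script.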
I am using hadoop 0.20.2.

core-site.xml:
---------------------------------------------------------------------------
<configuration>
  <property>
    <name>fs.default.name</name>
    <value>hdfs://hadoop:8020</value>
  </property>
  <property>
    <name>hadoop.tmp.dir</name>
    <value>/home/hadoop/tmp</value>
  </property>
  <property>
    <name>io.compression.codecs</name>
    <value>org.apache.hadoop.io.compress.GzipCodec,org.apache.hadoop.io.compress.DefaultCodec,org.apache.hadoop.io.compress.BZip2Codec,com.hadoop.compression.lzo.LzoCodec,com.hadoop.compression.lzo.LzopCodec</value>
  </property>
  <property>
    <name>io.compression.codec.lzo.class</name>
    <value>com.hadoop.compression.lzo.LzoCodec</value>
  </property>
</configuration>
---------------------------------------------------------------------------

mapred-site.xml:
---------------------------------------------------------------------------
<configuration>
  <property>
    <name>mapred.job.tracker</name>
    <value>AlexLuya:9001</value>
  </property>
  <property>
    <name>mapred.tasktracker.reduce.tasks.maximum</name>
    <value>1</value>
  </property>
  <property>
    <name>mapred.tasktracker.map.tasks.maximum</name>
    <value>1</value>
  </property>
  <property>
    <name>mapred.local.dir</name>
    <value>/home/alex/hadoop/mapred/local</value>
  </property>
  <property>
    <name>mapred.system.dir</name>
    <value>/tmp/hadoop/mapred/system</value>
  </property>
  <property>
    <name>mapreduce.map.output.compress</name>
    <value>true</value>
  </property>
  <property>
    <name>mapreduce.map.output.compress.codec</name>
    <value>com.hadoop.compression.lzo.LzoCodec</value>
  </property>
</configuration>
---------------------------------------------------------------------------
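One low-tech check that can rule out a typo: print the io.compression.codecs value from core-site.xml with commas turned into newlines, since each entry must be an exact class name for CompressionCodecFactory to load it. A sketch:

```shell
# Print each codec class name exactly as Hadoop will try to load it;
# a typo in the comma-separated list stands out line by line.
CODECS="org.apache.hadoop.io.compress.GzipCodec,org.apache.hadoop.io.compress.DefaultCodec,org.apache.hadoop.io.compress.BZip2Codec,com.hadoop.compression.lzo.LzoCodec,com.hadoop.compression.lzo.LzopCodec"
echo "$CODECS" | tr ',' '\n'

# Separately, confirm the class file is really inside the jar on each node
# (needs the jar on disk, so it is commented out here):
#   unzip -l "$HADOOP_HOME/lib/hadoop-lzo-0.4.4.jar" | grep LzopCodec.class
```

If every line matches a class that is genuinely on the task JVM's classpath, the ClassNotFoundException points at classpath distribution rather than the config.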