Mailing-List: contact core-user-help@hadoop.apache.org; run by ezmlm
Reply-To: core-user@hadoop.apache.org
Message-ID: <48E2C276.3050906@metaweb.com>
Date: Tue, 30 Sep 2008 17:21:10 -0700
From: Colin Evans
To: core-user@hadoop.apache.org
Subject: Re: LZO and native hadoop libraries
In-Reply-To: <7ED5454E-5506-4B58-BFD1-2CEBF4223664@rapleaf.com>

Hi Nathan,

This is defined in build/native//config.h. It is generated by autoconf
during the build; if it is missing or incorrect, you probably need to
make sure that the LZO libraries and headers are in your search paths
and then do a clean build.

-Colin


Nathan Marz wrote:
> Unfortunately, setting those environment variables did not help my
> issue. It appears that the "HADOOP_LZO_LIBRARY" variable is not
> defined in either LzoCompressor.c or LzoDecompressor.c. Where is this
> variable supposed to be set?
>
>
> On Sep 30, 2008, at 12:33 PM, Colin Evans wrote:
>
>> Hi Nathan,
>> You probably need to add the Java headers to your build path as well
>> - I don't know why the Mac doesn't ship with this as a default setting:
>>
>> export CPATH="/System/Library/Frameworks/JavaVM.framework/Versions/CurrentJDK/Home/include"
>> export CPPFLAGS="-I/System/Library/Frameworks/JavaVM.framework/Versions/CurrentJDK/Home/include"
>>
>>
>> Nathan Marz wrote:
>>> Thanks for the help. I was able to get past my previous issue, but
>>> the native build is still failing. Here is the end of the log output:
>>>
>>> [exec] then mv -f ".deps/LzoCompressor.Tpo" ".deps/LzoCompressor.Plo"; else rm -f ".deps/LzoCompressor.Tpo"; exit 1; fi
>>> [exec] mkdir .libs
>>> [exec] gcc -DHAVE_CONFIG_H -I. -I/Users/nathan/Downloads/hadoop-0.18.1/src/native/src/org/apache/hadoop/io/compress/lzo -I../../../../../../.. -I/Library/Java/Home//include -I/Users/nathan/Downloads/hadoop-0.18.1/src/native/src -g -Wall -fPIC -O2 -m32 -g -O2 -MT LzoCompressor.lo -MD -MP -MF .deps/LzoCompressor.Tpo -c /Users/nathan/Downloads/hadoop-0.18.1/src/native/src/org/apache/hadoop/io/compress/lzo/LzoCompressor.c -fno-common -DPIC -o .libs/LzoCompressor.o
>>> [exec] /Users/nathan/Downloads/hadoop-0.18.1/src/native/src/org/apache/hadoop/io/compress/lzo/LzoCompressor.c: In function 'Java_org_apache_hadoop_io_compress_lzo_LzoCompressor_initIDs':
>>> [exec] /Users/nathan/Downloads/hadoop-0.18.1/src/native/src/org/apache/hadoop/io/compress/lzo/LzoCompressor.c:135: error: syntax error before ',' token
>>> [exec] make[2]: *** [LzoCompressor.lo] Error 1
>>> [exec] make[1]: *** [all-recursive] Error 1
>>> [exec] make: *** [all] Error 2
>>>
>>> Any ideas?
>>>
>>>
>>> On Sep 30, 2008, at 11:53 AM, Colin Evans wrote:
>>>
>>>> There's a patch to get the native targets to build on Mac OS X:
>>>>
>>>> http://issues.apache.org/jira/browse/HADOOP-3659
>>>>
>>>> You probably will need to monkey with LDFLAGS as well to get it to
>>>> work, but we've been able to build the native libs for the Mac
>>>> without too much trouble.
>>>>
>>>>
>>>> Doug Cutting wrote:
>>>>> Arun C Murthy wrote:
>>>>>> You need to add libhadoop.so to your java.library.path.
>>>>>> libhadoop.so is available in the corresponding release in the
>>>>>> lib/native directory.
>>>>>
>>>>> I think he needs to first build libhadoop.so, since he appears to
>>>>> be running on OS X and we only provide Linux builds of this in
>>>>> releases.
>>>>>
>>>>> Doug