From: Arun C Murthy
To: core-user@hadoop.apache.org
Subject: Re: LZO and native hadoop libraries
Date: Wed, 1 Oct 2008 13:34:44 -0700

On Oct 1, 2008, at 12:54 PM, Nathan Marz wrote:

> Yes, this is exactly what I'm seeing. To be honest, I don't know
> which LZO native library it should be looking for. The LZO install
> dropped "liblzo2.la" and "liblzo2.a" in my /usr/local/lib directory,
> but not a file with a ".so" extension. Hardcoding would be fine as a
> temporary solution, but I don't know what to hardcode.

You do need liblzo2.so for this to work. The hardcoded value has to be
liblzo2.so too...

Arun

> Thanks,
> Nathan
>
> On Sep 30, 2008, at 8:45 PM, Amareshwari Sriramadasu wrote:
>
>> Are you seeing HADOOP-2009?
>>
>> Thanks
>> Amareshwari
>>
>> Nathan Marz wrote:
>>> Unfortunately, setting those environment variables did not help my
>>> issue. It appears that the "HADOOP_LZO_LIBRARY" variable is not
>>> defined in either LzoCompressor.c or LzoDecompressor.c. Where is
>>> this variable supposed to be set?
>>>
>>> On Sep 30, 2008, at 12:33 PM, Colin Evans wrote:
>>>
>>>> Hi Nathan,
>>>> You probably need to add the Java headers to your build path as
>>>> well - I don't know why the Mac doesn't ship with this as a
>>>> default setting:
>>>>
>>>> export CPATH="/System/Library/Frameworks/JavaVM.framework/Versions/CurrentJDK/Home/include"
>>>> export CPPFLAGS="-I/System/Library/Frameworks/JavaVM.framework/Versions/CurrentJDK/Home/include"
>>>>
>>>> Nathan Marz wrote:
>>>>> Thanks for the help.
>>>>> I was able to get past my previous issue,
>>>>> but the native build is still failing. Here is the end of the
>>>>> log output:
>>>>>
>>>>> [exec] then mv -f ".deps/LzoCompressor.Tpo" ".deps/LzoCompressor.Plo"; \
>>>>>        else rm -f ".deps/LzoCompressor.Tpo"; exit 1; fi
>>>>> [exec] mkdir .libs
>>>>> [exec] gcc -DHAVE_CONFIG_H -I. \
>>>>>        -I/Users/nathan/Downloads/hadoop-0.18.1/src/native/src/org/apache/hadoop/io/compress/lzo \
>>>>>        -I../../../../../../.. -I/Library/Java/Home//include \
>>>>>        -I/Users/nathan/Downloads/hadoop-0.18.1/src/native/src \
>>>>>        -g -Wall -fPIC -O2 -m32 -g -O2 -MT LzoCompressor.lo -MD -MP \
>>>>>        -MF .deps/LzoCompressor.Tpo \
>>>>>        -c /Users/nathan/Downloads/hadoop-0.18.1/src/native/src/org/apache/hadoop/io/compress/lzo/LzoCompressor.c \
>>>>>        -fno-common -DPIC -o .libs/LzoCompressor.o
>>>>> [exec] /Users/nathan/Downloads/hadoop-0.18.1/src/native/src/org/apache/hadoop/io/compress/lzo/LzoCompressor.c:
>>>>>        In function 'Java_org_apache_hadoop_io_compress_lzo_LzoCompressor_initIDs':
>>>>> [exec] /Users/nathan/Downloads/hadoop-0.18.1/src/native/src/org/apache/hadoop/io/compress/lzo/LzoCompressor.c:135:
>>>>>        error: syntax error before ',' token
>>>>> [exec] make[2]: *** [LzoCompressor.lo] Error 1
>>>>> [exec] make[1]: *** [all-recursive] Error 1
>>>>> [exec] make: *** [all] Error 2
>>>>>
>>>>> Any ideas?
>>>>>
>>>>> On Sep 30, 2008, at 11:53 AM, Colin Evans wrote:
>>>>>
>>>>>> There's a patch to get the native targets to build on Mac OS X:
>>>>>>
>>>>>> http://issues.apache.org/jira/browse/HADOOP-3659
>>>>>>
>>>>>> You probably will need to monkey with LDFLAGS as well to get it
>>>>>> to work, but we've been able to build the native libs for the
>>>>>> Mac without too much trouble.
>>>>>>
>>>>>> Doug Cutting wrote:
>>>>>>> Arun C Murthy wrote:
>>>>>>>> You need to add libhadoop.so to your java.library.path.
>>>>>>>> libhadoop.so is available in the corresponding release in the
>>>>>>>> lib/native directory.
>>>>>>>
>>>>>>> I think he needs to first build libhadoop.so, since he appears
>>>>>>> to be running on OS X and we only provide Linux builds of this
>>>>>>> in releases.
>>>>>>>
>>>>>>> Doug
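[Editor's note for readers hitting the same problem: the symptom Nathan
describes - an LZO install that leaves only "liblzo2.la" and "liblzo2.a" -
is what an LZO source tree configured without --enable-shared produces;
Hadoop's native glue needs the shared object (liblzo2.so on Linux,
liblzo2.dylib on OS X). The sketch below is illustrative and not from the
thread: the check_lzo helper and the demo directory are hypothetical names.]

```shell
# Sketch: check whether an LZO install directory contains the shared
# library that Hadoop's native code needs. A static-only build must be
# redone with:  ./configure --enable-shared && make && make install
check_lzo() {
    libdir="$1"
    if ls "$libdir"/liblzo2.so* >/dev/null 2>&1 \
       || ls "$libdir"/liblzo2.dylib* >/dev/null 2>&1; then
        echo "shared liblzo2 found in $libdir"
    else
        echo "no shared liblzo2 in $libdir: rebuild LZO with --enable-shared"
    fi
}

# Simulate the situation from the thread: only the static artifacts exist.
demo=$(mktemp -d)
touch "$demo/liblzo2.a" "$demo/liblzo2.la"
check_lzo "$demo"
rm -rf "$demo"
```

In a real install you would point check_lzo at /usr/local/lib (Nathan's
install prefix above) rather than a temporary directory.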