Return-Path: Delivered-To: apmail-lucene-hadoop-commits-archive@locus.apache.org Received: (qmail 42797 invoked from network); 4 Jan 2008 19:53:17 -0000 Received: from hermes.apache.org (HELO mail.apache.org) (140.211.11.2) by minotaur.apache.org with SMTP; 4 Jan 2008 19:53:17 -0000 Received: (qmail 12762 invoked by uid 500); 4 Jan 2008 19:53:06 -0000 Delivered-To: apmail-lucene-hadoop-commits-archive@lucene.apache.org Received: (qmail 12636 invoked by uid 500); 4 Jan 2008 19:53:06 -0000 Mailing-List: contact hadoop-commits-help@lucene.apache.org; run by ezmlm Precedence: bulk List-Help: List-Unsubscribe: List-Post: List-Id: Reply-To: hadoop-dev@lucene.apache.org Delivered-To: mailing list hadoop-commits@lucene.apache.org Received: (qmail 12626 invoked by uid 99); 4 Jan 2008 19:53:06 -0000 Received: from athena.apache.org (HELO athena.apache.org) (140.211.11.136) by apache.org (qpsmtpd/0.29) with ESMTP; Fri, 04 Jan 2008 11:53:06 -0800 X-ASF-Spam-Status: No, hits=-100.0 required=10.0 tests=ALL_TRUSTED,OBSCURED_EMAIL X-Spam-Check-By: apache.org Received: from [140.211.11.3] (HELO eris.apache.org) (140.211.11.3) by apache.org (qpsmtpd/0.29) with ESMTP; Fri, 04 Jan 2008 19:52:45 +0000 Received: by eris.apache.org (Postfix, from userid 65534) id A75E91A9846; Fri, 4 Jan 2008 11:52:49 -0800 (PST) Content-Type: text/plain; charset="utf-8" MIME-Version: 1.0 Content-Transfer-Encoding: 8bit Subject: svn commit: r608973 [4/4] - in /lucene/hadoop/trunk: ./ docs/ docs/skin/ docs/skin/images/ src/docs/src/documentation/content/xdocs/ Date: Fri, 04 Jan 2008 19:52:42 -0000 To: hadoop-commits@lucene.apache.org From: acmurthy@apache.org X-Mailer: svnmailer-1.0.8 Message-Id: <20080104195249.A75E91A9846@eris.apache.org> X-Virus-Checked: Checked by ClamAV on apache.org Added: lucene/hadoop/trunk/docs/native_libraries.html URL: http://svn.apache.org/viewvc/lucene/hadoop/trunk/docs/native_libraries.html?rev=608973&view=auto ============================================================================== --- lucene/hadoop/trunk/docs/native_libraries.html (added) +++ lucene/hadoop/trunk/docs/native_libraries.html Fri Jan 4 11:52:39 2008 @@ -0,0 +1,404 @@ + + + + + + + +Native Hadoop Libraries + + + + + + + + + +
+ + + +
+ + + + + + + + + + + + +
+
+
+
+ +
+ + +
+ +
+ +   +
+ + + + + +
+ +

Native Hadoop Libraries

+ + + + +

Purpose

+
+

Hadoop has native implementations of certain components, for reasons of
 both performance and the non-availability of Java implementations. These
 components are available in a single, dynamically-linked native library.
 On *nix platforms it is libhadoop.so. This document describes how to use
 the native libraries and details how to build them.

+
+ + + +

Components

+
+

Hadoop currently has the following
 compression codecs as native components:
  • zlib
  • gzip
  • lzo

Of the above, the native hadoop library is required for the lzo and
 gzip compression codecs to work.

+
+ + + +

Usage

+
+

It is fairly simple to use the native hadoop libraries:

+
    + +
  • + Take a look at the + supported platforms. +
  • + +
  • Either download the pre-built 32-bit i386-Linux native hadoop
 libraries (available as part of the hadoop distribution in the
 lib/native directory), or build them yourself.
  • + +
  • Ensure you have either or both of the zlib-1.2 and
 lzo-2.0 packages installed for your platform,
 depending on your needs.
  • + +
+

The bin/hadoop script ensures that the native hadoop + library is on the library path via the system property + -Djava.library.path=<path>.

+
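You can also verify from application code whether the library was loaded.
 A minimal sketch, assuming the isNativeCodeLoaded() method on
 org.apache.hadoop.util.NativeCodeLoader (the class that emits the log
 messages shown below); the NativeCheck class name is hypothetical:

    import org.apache.hadoop.util.NativeCodeLoader;

    public class NativeCheck {
      public static void main(String[] args) {
        // True only if libhadoop.so was found via java.library.path and loaded.
        if (NativeCodeLoader.isNativeCodeLoaded()) {
          System.out.println("native-hadoop library loaded");
        } else {
          System.out.println("falling back to builtin-java classes");
        }
      }
    }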

To check that everything went all right, check the hadoop log files for:

+

+ + + DEBUG util.NativeCodeLoader - Trying to load the custom-built + native-hadoop library... + +
+ + + INFO util.NativeCodeLoader - Loaded the native-hadoop library + + +

+

If something goes wrong, then:

+

+ + + INFO util.NativeCodeLoader - Unable to load native-hadoop library for + your platform... using builtin-java classes where applicable + + +

+
+ + + +

Supported Platforms

+
+

The Hadoop native library is supported on *nix platforms only.
 Unfortunately, it is known not to work on Cygwin
 or Mac OS X, and it has mainly been used on the
 GNU/Linux platform.

+

It has been tested on the following GNU/Linux distributions:

  • RHEL / Fedora
  • Ubuntu
  • Gentoo

On all the above distributions, a 32/64-bit Hadoop native library will
 work with the respective 32/64-bit JVM.

+
+ + + +

Building Native Hadoop Libraries

+
+

The Hadoop native library is written in
 ANSI C and built using
 the GNU autotools-chain (autoconf, autoheader, automake, autoscan, libtool).
 This means it should be straightforward to build it on any platform with
 a standards-compliant C compiler and the GNU autotools-chain.
 See the supported platforms.

+

In particular, the packages you need on the target platform are:

+
    + +
  • + C compiler (e.g. GNU C Compiler) +
  • + +
  • GNU Autotools chain:
 autoconf,
 automake,
 libtool
  • + +
  • + zlib-development package (stable version >= 1.2.0) +
  • + +
  • + lzo-development package (stable version >= 2.0) +
  • + +
+

Once you have the prerequisites, use the standard build.xml
 and pass along the compile.native flag (set to
 true) to build the native hadoop library:

+

+$ ant -Dcompile.native=true <target> +

+

The native hadoop library is not built by default, since not everyone is
 interested in building it.

+

You should see the newly-built native hadoop library in:

+

+$ build/native/<platform>/lib +

+

where <platform> is a combination of the system properties
 ${os.name}-${os.arch}-${sun.arch.data.model}; e.g.
 Linux-i386-32.

+ +
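As an illustration, here is a small sketch that derives the same platform
 string from those system properties (note that sun.arch.data.model is
 specific to Sun JVMs; the PlatformName class name is hypothetical):

    public class PlatformName {
      public static void main(String[] args) {
        // Prints e.g. "Linux-i386-32", matching build/native/<platform>/lib.
        System.out.println(System.getProperty("os.name") + "-" +
                           System.getProperty("os.arch") + "-" +
                           System.getProperty("sun.arch.data.model"));
      }
    }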

Notes

+
    + +
  • + It is mandatory to have both the zlib and lzo + development packages on the target platform for building the + native hadoop library; however for deployment it is sufficient to + install zlib or lzo if you wish to use only one of them. +
  • + +
  • You need the correct 32/64-bit zlib and lzo libraries,
 matching the 32/64-bit JVM on the target platform, both for
 building and for deploying the native hadoop library.
  • + +
+
+ +
+ +
 
+
+

Added: lucene/hadoop/trunk/docs/native_libraries.pdf
URL: http://svn.apache.org/viewvc/lucene/hadoop/trunk/docs/native_libraries.pdf?rev=608973&view=auto
==============================================================================
--- lucene/hadoop/trunk/docs/native_libraries.pdf (added)
+++ lucene/hadoop/trunk/docs/native_libraries.pdf Fri Jan  4 11:52:39 2008
@@ -0,0 +1,612 @@
[binary PDF content omitted]
Modified: lucene/hadoop/trunk/docs/quickstart.html
URL: http://svn.apache.org/viewvc/lucene/hadoop/trunk/docs/quickstart.html?rev=608973&r1=608972&r2=608973&view=diff
==============================================================================
--- lucene/hadoop/trunk/docs/quickstart.html (original)
+++ lucene/hadoop/trunk/docs/quickstart.html Fri Jan  4 11:52:39 2008
Modified: lucene/hadoop/trunk/src/docs/src/documentation/content/xdocs/index.xml
URL: http://svn.apache.org/viewvc/lucene/hadoop/trunk/src/docs/src/documentation/content/xdocs/index.xml?rev=608973&r1=608972&r2=608973&view=diff
==============================================================================
--- lucene/hadoop/trunk/src/docs/src/documentation/content/xdocs/index.xml (original)
+++ lucene/hadoop/trunk/src/docs/src/documentation/content/xdocs/index.xml Fri Jan  4 11:52:39 2008
@@ -19,6 +19,7 @@
  • Hadoop Cluster Setup
  • Hadoop Distributed File System
  • Hadoop Map-Reduce Tutorial
  • + Native Hadoop Libraries
  • API Docs
  • Wiki
  • FAQ
  • Modified: lucene/hadoop/trunk/src/docs/src/documentation/content/xdocs/mapred_tutorial.xml URL: http://svn.apache.org/viewvc/lucene/hadoop/trunk/src/docs/src/documentation/content/xdocs/mapred_tutorial.xml?rev=608973&r1=608972&r2=608973&view=diff ============================================================================== --- lucene/hadoop/trunk/src/docs/src/documentation/content/xdocs/mapred_tutorial.xml (original) +++ lucene/hadoop/trunk/src/docs/src/documentation/content/xdocs/mapred_tutorial.xml Fri Jan 4 11:52:39 2008 @@ -1109,8 +1109,14 @@ individual task.

    - TextInputFormat is the default InputFormat. -

    + TextInputFormat is the default InputFormat.

    + +

    If TextInputFormat is the InputFormat for a
 given job, the framework detects input-files with the .gz and
 .lzo extensions and automatically decompresses them using the
 appropriate CompressionCodec. Note, however, that compressed
 files with these extensions cannot be split, and each compressed
 file is processed in its entirety by a single mapper; a sketch of
 such a job follows.
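    For illustration, a minimal sketch of such a job; the GzInput class
 and the input/output paths here are hypothetical, and
 JobConf.setInputPath/setOutputPath are this era's job-setup calls:

    import org.apache.hadoop.fs.Path;
    import org.apache.hadoop.mapred.JobClient;
    import org.apache.hadoop.mapred.JobConf;

    public class GzInput {
      public static void main(String[] args) throws Exception {
        JobConf conf = new JobConf(GzInput.class);
        // TextInputFormat is the default InputFormat, so a file such as
        // input/access.log.gz is decompressed on the fly; being unsplittable,
        // it is processed by exactly one mapper.
        conf.setInputPath(new Path("input"));
        conf.setOutputPath(new Path("output"));
        JobClient.runJob(conf);
      }
    }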

    InputSplit @@ -1336,6 +1342,70 @@ JobControl is a utility which encapsulates a set of Map-Reduce jobs and their dependencies.

    + +
    + Data Compression + +

    Hadoop Map-Reduce provides facilities for the application-writer to
 specify compression for both intermediate map-outputs and the
 job-outputs, i.e. the output of the reduces. It also comes bundled with
 CompressionCodec implementations for the
 zlib and lzo compression
 algorithms. The gzip file format is also
 supported.

    + +

    Hadoop also provides native implementations of the above compression
 codecs, for reasons of both performance (zlib) and the non-availability of
 Java libraries (lzo). More details on their usage and availability can be
 found here.

    + +
    + Intermediate Outputs + +

    Applications can control compression of intermediate map-outputs + via the + + JobConf.setCompressMapOutput(boolean) api and the + CompressionCodec to be used via the + + JobConf.setMapOutputCompressorClass(Class) api. Since + the intermediate map-outputs are always stored in the + SequenceFile + format, the + + SequenceFile.CompressionType (i.e. + + RECORD / + + BLOCK - defaults to RECORD) can be specified via the + + JobConf.setMapOutputCompressionType(SequenceFile.CompressionType) + api.

    +
    + +
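    A short sketch putting these intermediate-output calls together
 (the MyJob class is hypothetical and the lzo codec is just an example
 choice):

    import org.apache.hadoop.io.SequenceFile;
    import org.apache.hadoop.io.compress.LzoCodec;
    import org.apache.hadoop.mapred.JobConf;

    public class MapOutputCompression {
      static JobConf configure(JobConf conf) {
        conf.setCompressMapOutput(true);                  // compress map-outputs
        conf.setMapOutputCompressorClass(LzoCodec.class); // with the lzo codec
        conf.setMapOutputCompressionType(                 // block-compressed
            SequenceFile.CompressionType.BLOCK);          // intermediate SequenceFiles
        return conf;
      }
    }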
    + Job Outputs + +

    Applications can control compression of job-outputs via the + + OutputFormatBase.setCompressOutput(JobConf, boolean) api and the + CompressionCodec to be used can be specified via the + + OutputFormatBase.setOutputCompressorClass(JobConf, Class) api.

    + +

    If the job outputs are to be stored in the
 SequenceFileOutputFormat, the required
 SequenceFile.CompressionType (i.e. RECORD /
 BLOCK - defaults to RECORD) can be specified
 via the
 SequenceFileOutputFormat.setOutputCompressionType(JobConf,
 SequenceFile.CompressionType) api.

    +
    +
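    Similarly, a sketch of the job-output calls above (the gzip codec is
 just an example, and the JobOutputCompression class name is hypothetical):

    import org.apache.hadoop.io.SequenceFile;
    import org.apache.hadoop.io.compress.GzipCodec;
    import org.apache.hadoop.mapred.JobConf;
    import org.apache.hadoop.mapred.OutputFormatBase;
    import org.apache.hadoop.mapred.SequenceFileOutputFormat;

    public class JobOutputCompression {
      static JobConf configure(JobConf conf) {
        OutputFormatBase.setCompressOutput(conf, true);
        OutputFormatBase.setOutputCompressorClass(conf, GzipCodec.class);
        // When writing SequenceFile output, also pick the compression type:
        conf.setOutputFormat(SequenceFileOutputFormat.class);
        SequenceFileOutputFormat.setOutputCompressionType(conf,
            SequenceFile.CompressionType.BLOCK);
        return conf;
      }
    }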
    + Added: lucene/hadoop/trunk/src/docs/src/documentation/content/xdocs/native_libraries.xml URL: http://svn.apache.org/viewvc/lucene/hadoop/trunk/src/docs/src/documentation/content/xdocs/native_libraries.xml?rev=608973&view=auto ============================================================================== --- lucene/hadoop/trunk/src/docs/src/documentation/content/xdocs/native_libraries.xml (added) +++ lucene/hadoop/trunk/src/docs/src/documentation/content/xdocs/native_libraries.xml Fri Jan 4 11:52:39 2008 @@ -0,0 +1,175 @@ + + + + + + +
    + Native Hadoop Libraries +
    + + + +
    + Purpose + +

    Hadoop has native implementations of certain components, for reasons of
 both performance and the non-availability of Java implementations. These
 components are available in a single, dynamically-linked native library.
 On *nix platforms it is libhadoop.so. This document describes how to use
 the native libraries and details how to build them.

    +
    + +
    + Components + +

    Hadoop currently has the following
 compression codecs as native components:
    • zlib
    • gzip
    • lzo

    Of the above, the native hadoop library is required for the lzo and
 gzip compression codecs to work.

    +
    + +
    + Usage + +

    It is fairly simple to use the native hadoop libraries:

    + +
      +
    • + Take a look at the + supported platforms. +
    • +
    • Either download the pre-built 32-bit i386-Linux native hadoop
 libraries (available as part of the hadoop distribution in the
 lib/native directory), or build them yourself.
    • +
    • Ensure you have either or both of the zlib-1.2 and
 lzo-2.0 packages installed for your platform,
 depending on your needs.
    • +
    + +

    The bin/hadoop script ensures that the native hadoop + library is on the library path via the system property + -Djava.library.path=<path>.

    + +

    To check that everything went all right, check the hadoop log files for:

    + +

    + + DEBUG util.NativeCodeLoader - Trying to load the custom-built + native-hadoop library... +
    + + INFO util.NativeCodeLoader - Loaded the native-hadoop library + +

    + +

    If something goes wrong, then:

    +

    + + INFO util.NativeCodeLoader - Unable to load native-hadoop library for + your platform... using builtin-java classes where applicable + +

    +
    + +
    + Supported Platforms + +

    The Hadoop native library is supported on *nix platforms only.
 Unfortunately, it is known not to work on Cygwin
 or Mac OS X, and it has mainly been used on the
 GNU/Linux platform.

    + +

    It has been tested on the following GNU/Linux distributions:

    • RHEL / Fedora
    • Ubuntu
    • Gentoo

    On all the above distributions, a 32/64-bit Hadoop native library will
 work with the respective 32/64-bit JVM.

    +
    + +
    + Building Native Hadoop Libraries + +

    The Hadoop native library is written in
 ANSI C and built using
 the GNU autotools-chain (autoconf, autoheader, automake, autoscan, libtool).
 This means it should be straightforward to build it on any platform with
 a standards-compliant C compiler and the GNU autotools-chain.
 See the supported platforms.

    + +

    In particular, the packages you need on the target platform are:

    +
      +
    • + C compiler (e.g. GNU C Compiler) +
    • +
    • GNU Autotools chain:
 autoconf,
 automake,
 libtool
    • +
    • + zlib-development package (stable version >= 1.2.0) +
    • +
    • + lzo-development package (stable version >= 2.0) +
    • +
    + +

    Once you have the prerequisites, use the standard build.xml
 and pass along the compile.native flag (set to
 true) to build the native hadoop library:

    + +

    $ ant -Dcompile.native=true <target>

    + +

    The native hadoop library is not built by default, since not everyone is
 interested in building it.

    + +

    You should see the newly-built native hadoop library in:

    + +

    $ build/native/<platform>/lib

    + +

    where <platform> is a combination of the system properties
 ${os.name}-${os.arch}-${sun.arch.data.model}; e.g.
 Linux-i386-32.

    + +
    + Notes + +
      +
    • + It is mandatory to have both the zlib and lzo + development packages on the target platform for building the + native hadoop library; however for deployment it is sufficient to + install zlib or lzo if you wish to use only one of them. +
    • +
    • You need the correct 32/64-bit zlib and lzo libraries,
 matching the 32/64-bit JVM on the target platform, both for
 building and for deploying the native hadoop library.
    • +
    +
    +
    + + +
    Modified: lucene/hadoop/trunk/src/docs/src/documentation/content/xdocs/site.xml URL: http://svn.apache.org/viewvc/lucene/hadoop/trunk/src/docs/src/documentation/content/xdocs/site.xml?rev=608973&r1=608972&r2=608973&view=diff ============================================================================== --- lucene/hadoop/trunk/src/docs/src/documentation/content/xdocs/site.xml (original) +++ lucene/hadoop/trunk/src/docs/src/documentation/content/xdocs/site.xml Fri Jan 4 11:52:39 2008 @@ -22,6 +22,7 @@ + @@ -33,13 +34,20 @@ - + + + + + + + + @@ -63,6 +71,11 @@ + + + + + @@ -99,6 +112,9 @@ + + + @@ -113,6 +129,13 @@ + + + + + + +