Subject: Re: HDFS and Linux File System
From: Ted Yu <yuzhihong@gmail.com>
To: common-user@hadoop.apache.org
Date: Mon, 7 Sep 2009 21:23:12 -0700
Message-ID: <17e273100909072123j21be9ac4ne3a029d8113d43d7@mail.gmail.com>

I see "Repository moved temporarily to '/viewvc/hadoop/chukwa/'. Please relocate"

On Mon, Sep 7, 2009 at 4:21 PM, Matt Massie wrote:

> You can find chukwa at...
>
> http://svn.apache.org/viewvc/hadoop/chukwa/
>
> -Matt
>
>
> On Mon, Sep 7, 2009 at 3:03 PM, Ted Yu wrote:
>
> > http://svn.apache.org/viewvc/hadoop/core/trunk/src/contrib/chukwa/
> > produced an exception.
> > And Chukwa isn't in hadoop/src/contrib of 0.20.0
> >
> > On Mon, Sep 7, 2009 at 2:44 PM, Ted Yu wrote:
> >
> > > I tried to compile fuse-dfs. libhdfs.so has been compiled.
> > >
> > > Under hadoop/src/contrib/fuse-dfs:
> > >
> > >   ant -Dlibhdfs=1 -Dfusedfs=1
> > >
> > > Then I got:
> > >
> > >   [exec] make[1]: Entering directory
> > >     `/usr/local/hadoop/src/contrib/fuse-dfs/src'
> > >   [exec] if gcc -DPACKAGE_NAME=\"fuse_dfs\" -DPACKAGE_TARNAME=\"fuse_dfs\"
> > >     -DPACKAGE_VERSION=\"0.1.0\" -DPACKAGE_STRING=\"fuse_dfs\ 0.1.0\"
> > >     -DPACKAGE_BUGREPORT=\"\" -DGETGROUPS_T=gid_t -DSTDC_HEADERS=1
> > >     -DHAVE_SYS_TYPES_H=1 -DHAVE_SYS_STAT_H=1 -DHAVE_STDLIB_H=1
> > >     -DHAVE_STRING_H=1 -DHAVE_MEMORY_H=1 -DHAVE_STRINGS_H=1
> > >     -DHAVE_INTTYPES_H=1 -DHAVE_STDINT_H=1 -DHAVE_UNISTD_H=1
> > >     -DHAVE_GETGROUPS=1 -DGETGROUPS_T=gid_t -I. -I. -DPERMS=1
> > >     -D_FILE_OFFSET_BITS=64 -I/usr/lib/jvm/java-1.6.0-openjdk-1.6.0.0/include
> > >     -I/usr/local/hadoop/src/c++/libhdfs/
> > >     -I/usr/lib/jvm/java-1.6.0-openjdk-1.6.0.0/include/linux/
> > >     -D_FUSE_DFS_VERSION=\"0.1.0\" -DPROTECTED_PATHS=\"\" -I/include -Wall
> > >     -O3 -MT fuse_dfs.o -MD -MP -MF ".deps/fuse_dfs.Tpo" -c -o fuse_dfs.o
> > >     fuse_dfs.c; \
> > >   [exec] then mv -f ".deps/fuse_dfs.Tpo" ".deps/fuse_dfs.Po";
> > >     else rm -f ".deps/fuse_dfs.Tpo"; exit 1; fi
> > >   [exec] In file included from fuse_dfs.c:19:
> > >   [exec] fuse_dfs.h:31:18: error: fuse.h: No such file or directory
> > >   [exec] fuse_dfs.h:32:27: error: fuse/fuse_opt.h: No such file or directory
> > >   [exec] In file included from fuse_dfs.c:20:
> > >
> > > Where can I find fuse_opt.h and fuse.h?
> > >
> > > Thanks
> > >
> > >
> > > On Mon, Sep 7, 2009 at 12:08 PM, Brian Bockelman wrote:
> > >
> > >> Hey Ted,
> > >>
> > >> It's hard to avoid copying files, unless you are able to change your
> > >> application to talk to HDFS directly (and even then, there are a lot of
> > >> "gotchas" that you wouldn't have to put up with at an application
> > >> level -- look at the Chukwa paper).
> > >>
> > >> I would advise looking at Chukwa, http://wiki.apache.org/hadoop/Chukwa,
> > >> and then rotating logfiles quickly.
> > >>
> > >> Facebook's Scribe is supposed to do this sort of stuff too (and is very
> > >> impressive), but I'm not familiar with it. At face value, it appears
> > >> that it might take more effort to get Scribe well-integrated, but it
> > >> would have more functionality.
> > >>
> > >> Brian
> > >>
> > >>
> > >> On Sep 7, 2009, at 4:18 AM, Ted Yu wrote:
> > >>
> > >>> We're using hadoop 0.20.0 to analyze large log files from web servers.
> > >>> I am looking for better HDFS support so that I don't have to copy log
> > >>> files from Linux File System over.
> > >>>
> > >>> Please comment.
> > >>>
> > >>> Thanks
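
[Archive editor's note: for readers hitting the same compile failure, the
`fuse.h` / `fuse/fuse_opt.h` errors, together with the bare `-I/include` in
the failing gcc line, suggest the FUSE development headers were not installed
and the build's FUSE prefix variable was empty. A sketch of one possible fix;
the package names are distro-specific, and `FUSE_HOME` as the variable the
fuse-dfs build reads is an assumption.]

```shell
# Install the FUSE userspace library and its development headers,
# which provide /usr/include/fuse.h and /usr/include/fuse/fuse_opt.h.
# (Package names below are assumptions and vary by distribution.)
yum install -y fuse fuse-devel        # Fedora/CentOS/RHEL
# apt-get install -y libfuse-dev      # Debian/Ubuntu equivalent

# The "-I/include" in the gcc line hints that the build's FUSE prefix
# was unset; point it at the install prefix and rebuild.
export FUSE_HOME=/usr                 # assumption: the build reads FUSE_HOME
cd /usr/local/hadoop/src/contrib/fuse-dfs
ant -Dlibhdfs=1 -Dfusedfs=1
```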
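[Archive editor's note: once fuse-dfs builds, the copy step Ted is trying to
avoid can in principle be dropped by mounting HDFS as a local filesystem and
letting the log rotator write under the mount. A minimal sketch; the NameNode
host/port, mount point, and log paths are illustrative assumptions.]

```shell
# Mount HDFS at /mnt/hdfs using the freshly built fuse-dfs.
# (namenode host/port and all paths below are assumptions)
mkdir -p /mnt/hdfs
cd /usr/local/hadoop/src/contrib/fuse-dfs/src
./fuse_dfs_wrapper.sh dfs://namenode:9000 /mnt/hdfs

# Rotated web-server logs can then be moved into HDFS with an ordinary
# filesystem operation instead of a separate "hadoop fs -put" step:
cp /var/log/httpd/access.log.1 /mnt/hdfs/logs/

# Caveat: HDFS of the 0.20 era is write-once, so appending to or
# rewriting an existing file through the mount will fail; write each
# rotated logfile as a new file.
```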