hadoop-common-user mailing list archives

From Ted Yu <yuzhih...@gmail.com>
Subject Re: HDFS and Linux File System
Date Mon, 07 Sep 2009 21:44:07 GMT
I tried to compile fuse-dfs; libhdfs.so has already been compiled.

Under hadoop/src/contrib/fuse-dfs:
ant -Dlibhdfs=1 -Dfusedfs=1

Then I got:
     [exec] make[1]: Entering directory
`/usr/local/hadoop/src/contrib/fuse-dfs/src'
     [exec] if gcc -DPACKAGE_NAME=\"fuse_dfs\"
-DPACKAGE_TARNAME=\"fuse_dfs\" -DPACKAGE_VERSION=\"0.1.0\"
-DPACKAGE_STRING=\"fuse_dfs\ 0.1.0\" -DPACKAGE_BUGREPORT=\"\"
-DGETGROUPS_T=gid_t -DSTDC_HEADERS=1 -DHAVE_SYS_TYPES_H=1
-DHAVE_SYS_STAT_H=1 -DHAVE_STDLIB_H=1 -DHAVE_STRING_H=1 -DHAVE_MEMORY_H=1
-DHAVE_STRINGS_H=1 -DHAVE_INTTYPES_H=1 -DHAVE_STDINT_H=1 -DHAVE_UNISTD_H=1
-DHAVE_GETGROUPS=1 -DGETGROUPS_T=gid_t  -I. -I.  -DPERMS=1
-D_FILE_OFFSET_BITS=64 -I/usr/lib/jvm/java-1.6.0-openjdk-1.6.0.0/include
-I/usr/local/hadoop/src/c++/libhdfs/
-I/usr/lib/jvm/java-1.6.0-openjdk-1.6.0.0/include/linux/
-D_FUSE_DFS_VERSION=\"0.1.0\" -DPROTECTED_PATHS=\"\" -I/include   -Wall -O3
-MT fuse_dfs.o -MD -MP -MF ".deps/fuse_dfs.Tpo" -c -o fuse_dfs.o fuse_dfs.c;
\
     [exec]     then mv -f ".deps/fuse_dfs.Tpo" ".deps/fuse_dfs.Po"; else rm
-f ".deps/fuse_dfs.Tpo"; exit 1; fi
     [exec] In file included from fuse_dfs.c:19:
     [exec] fuse_dfs.h:31:18: error: fuse.h: No such file or directory
     [exec] fuse_dfs.h:32:27: error: fuse/fuse_opt.h: No such file or
directory
     [exec] In file included from fuse_dfs.c:20:

Where can I find fuse_opt.h and fuse.h?

Thanks
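
[Those two headers come from the FUSE development package, not from Hadoop.
A minimal sketch of the usual fix, assuming a yum-based system (Debian/Ubuntu
would use libfuse-dev instead); the paths and rebuild step mirror the ones
above:

    # fuse.h and fuse/fuse_opt.h ship with the FUSE development headers
    sudo yum install fuse fuse-devel
    ls /usr/include/fuse.h /usr/include/fuse/fuse_opt.h

    # the bare -I/include in the failing gcc line suggests configure never
    # found a FUSE prefix, so rebuild once the headers are installed
    cd /usr/local/hadoop/src/contrib/fuse-dfs
    ant -Dlibhdfs=1 -Dfusedfs=1
]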

On Mon, Sep 7, 2009 at 12:08 PM, Brian Bockelman <bbockelm@cse.unl.edu> wrote:

> Hey Ted,
>
> It's hard to avoid copying files unless you are able to change your
> application to talk to HDFS directly (and even then, there are a lot of
> "gotchas" that you wouldn't have to put up with at the application level --
> look at the Chukwa paper).
>
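
[A minimal sketch of what "talking to HDFS directly" can look like without
changing application code, assuming the web server has a configured Hadoop
client; hadoop fs -put with "-" as the source reads from stdin, and all
paths here are illustrative:

    # stream a growing log straight into HDFS; one gotcha of this era of
    # HDFS is that the data only becomes readable once the stream is closed
    tail -F /var/log/httpd/access_log | \
        hadoop fs -put - /logs/live/access_log
]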
> I would advise looking at Chukwa, http://wiki.apache.org/hadoop/Chukwa,
> and then rotating logfiles quickly.
>
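
[For the rotate-then-upload approach, a sketch assuming logrotate has already
produced the compressed files and the target directory exists in HDFS (every
name here is illustrative):

    # upload rotated logs, dropping the local copy only on success
    for f in /var/log/httpd/access_log-*.gz; do
        hadoop fs -put "$f" "/logs/$(hostname)/" && rm -f "$f"
    done
]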
> Facebook's Scribe is supposed to do this sort of thing too (and is very
> impressive), but I'm not familiar with it.  At face value, it appears that
> it might take more effort to get Scribe well-integrated, but it would have
> more functionality.
>
> Brian
>
>
> On Sep 7, 2009, at 4:18 AM, Ted Yu wrote:
>
>> We're using Hadoop 0.20.0 to analyze large log files from web servers.
>> I am looking for better HDFS support so that I don't have to copy log
>> files over from the Linux file system.
>>
>> Please comment.
>>
>> Thanks
>>
>
>
