hadoop-common-user mailing list archives

From Weiming Lu <weimin...@gmail.com>
Subject Re: Help with fuse-dfs
Date Wed, 16 Dec 2009 01:54:05 GMT
Thanks for your reply. I have added the "-d" option when calling
fuse_dfs, but I can't see any stack traces. The only thing I see on
the console is:
port=54310,server=10.15.62.4
fuse-dfs didn't recognize /mnt/dfs,-2
fuse-dfs ignoring option -d
fuse-dfs ignoring option -o
fuse-dfs didn't recognize allow_other,-2
fuse: invalid argument `allow_other'

The "-d" option is being ignored, so perhaps something went wrong in my
build steps. Can anyone help? Thanks.
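[Editor's note: libfuse rejects a bare `allow_other` token; the option normally has to follow `-o`, and non-root mounts additionally need `user_allow_other` enabled in /etc/fuse.conf. A sketch of the invocation (the URI and mountpoint are taken from this thread; the option ordering is an assumption, not verified against 0.18.2):]

```shell
# Sketch (assumption): keep FUSE options grouped after "-o", following
# the dfs URI and the mountpoint; a bare "allow_other" argument is what
# produces "fuse: invalid argument `allow_other'".
URI=dfs://10.15.62.4:54310
MOUNTPOINT=/mnt/dfs
CMD="./fuse_dfs_wrapper.sh $URI $MOUNTPOINT -d -o allow_other"
echo "$CMD"
```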


On Wed, Dec 16, 2009 at 12:49 AM, Brian Bockelman <bbockelm@cse.unl.edu> wrote:
> Hey,
>
> One thing you can do is call the fuse_dfs executable with the "-d" option.  This will
keep the FUSE-DFS process in the foreground of the terminal, and you will see any stack traces
that occur within the Hadoop portion.  Very useful for debugging.
>
> Brian
>
> On Dec 15, 2009, at 12:21 AM, Weiming Lu wrote:
>
>> Thanks very much. After many tries I have built fuse-dfs
>> successfully, but there are still some problems. We can list files in
>> Hadoop with "./hadoop fs -ls", but after mounting, no files are shown
>> in /mnt/dfs, and there is also no message in /var/log/messages.
>> Has anybody encountered this?
>>
>> We are using Hadoop 0.18.2, and our platform is Ubuntu 8.04, amd64.
>>
>> The build steps we followed are:
>> 1. modified src/c++/libhdfs/Makefile by
>> adding OS_ARCH=amd64 and JAVA_HOME,
>> removing -m32 from CPPFLAGS and LDFLAGS
>> 2. modified src/c++/utils/configure and src/c++/pipes/configure by
>> adding OS_ARCH=amd64,
>> making them executable with "chmod 755"
>> 3.ln -s /usr/lib/jvm/java-6-sun/jre/lib/amd64/server/libjvm.so /usr/local/lib
>> 4.install fuse with version 2.8.1
>> 5.ant compile-libhdfs -Dlibhdfs=1
>> 6.ant compile-contrib -Dcompile.c++=1 -Dfusedfs=1 -Dlibhdfs=1
>> -Dlibhdfs.noperms=1
>> Now we have fuse_dfs and fuse_dfs_wrapper.sh in
>> $HADOOP_HOME/build/contrib/fuse-dfs.
>>
>> 7. added the following content to fuse_dfs_wrapper.sh:
>> #!/bin/bash
>> export JAVA_HOME=/usr/lib/jvm/java-6-sun
>> export OS_NAME=linux
>> export OS_ARCH=amd64
>> export HADOOP_HOME=/home/lwm/work/hadoop
>>
>> 8. ln -s /home/lwm/work/hadoop/build/libhdfs/libhdfs.so
>> /usr/local/lib/libhdfs.so
>>
>> 9. modified /etc/ld.so.conf by adding /usr/local/lib
>> and then ldconfig
>> 10. sudo mkdir /mnt/dfs
>> 11. ./fuse_dfs_wrapper.sh dfs://10.15.62.4:54310 /mnt/dfs
>> we got:
>> port=54310,server=10.15.62.4
>> fuse-dfs didn't recognize /mnt/dfs,-2
>> fuse-dfs ignoring option -d
>> fuse-dfs ignoring option -o
>> fuse-dfs didn't recognize allow_other,-2
>> fuse: invalid argument `allow_other'
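[Editor's note: the environment the steps above set up can be sketched as one shell fragment. The paths are the ones given in this thread; the LD_LIBRARY_PATH line is an assumption, added because fuse_dfs must be able to locate libjvm.so and libhdfs.so at runtime and a missing search path is a common cause of silent startup failures:]

```shell
# Environment for fuse_dfs_wrapper.sh (paths from this thread).
export JAVA_HOME=/usr/lib/jvm/java-6-sun
export OS_NAME=linux
export OS_ARCH=amd64
export HADOOP_HOME=/home/lwm/work/hadoop
# Assumption: make libjvm.so and libhdfs.so findable at runtime.
export LD_LIBRARY_PATH=$JAVA_HOME/jre/lib/$OS_ARCH/server:$HADOOP_HOME/build/libhdfs:/usr/local/lib
echo "$LD_LIBRARY_PATH"
```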
>>
>>
>> 2009-12-15, "Hazem Mahmoud" <hmahmoud@gmail.com> wrote:
>>> Here are some notes I took when installing FUSE on our system. This was
>>> for a Hadoop 0.19.1 installation.
>>>
>>>
>>>   Installing Fuse/Hadoop:
>>>
>>> 2.     mkdir /hdfs/client01
>>>
>>> 4.     Install Sun JDK 1.6 (must be Sun)
>>>
>>> a.     RPM: jdk-6u14-ea-linux-amd64.rpm
>>>
>>> b.     This DID NOT WORK:
>>> java version "1.6.0"
>>> OpenJDK  Runtime Environment (build 1.6.0-b09)
>>> OpenJDK 64-Bit Server VM (build 1.6.0-b09, mixed mode)
>>>
>>> c.      Must be this:
>>> java version "1.6.0_14-ea"
>>> Java(TM) SE Runtime Environment (build 1.6.0_14-ea-b04)
>>> Java HotSpot(TM) 64-Bit Server VM (build 14.0-b13, mixed mode)
>>>
>>> 5.     wget
>>> http://newman.ultralight.org/repos/hadoop/4/x86_64/hadoop-0.19.1-7.el4.x86_64.rpm
>>>
>>> 6.     wget
>>> http://newman.ultralight.org/repos/hadoop/4/x86_64/fuse-libs-2.7.4-8_10.el4.x86_64.rpm
>>>
>>> 7.     wget
>>> http://dag.wieers.com/rpm/packages/fuse/fuse-2.7.3-1.el5.rf.x86_64.rpm
>>> OR
>>> wget
>>> http://newman.ultralight.org/repos/hadoop/4/x86_64/fuse-2.7.4-8_10.el4.x86_64.rpm
>>>
>>> 8.     wget
>>> http://newman.ultralight.org/repos/hadoop/4/x86_64/hadoop-fuse-0.19.1-7.el4.x86_64.rpm
>>>
>>> 9.     rpm -ivh hadoop-0.19.1-7.el4.x86_64.rpm
>>>
>>> 10. rpm -ivh fuse-2.7.3-1.el5.rf.x86_64.rpm
>>>
>>> 11. rpm -ivh hadoop-fuse-0.19.1-7.el4.x86_64.rpm
>>>
>>> 12. rpm -ivh fuse-libs-2.7.4-8_10.el4.x86_64.rpm
>>>
>>> 13. yum install hadoop
>>>
>>> 14. yum install hadoop-fuse fuse-libs
>>>
>>> 15. In /etc/fstab:
>>>
>>> a.     hdfs# /hdfs/client01 fuse server=<server_hostname>,port=9000,rdbuffer=1048576,allow_other,big_writes 0 0
>>>
>>>
>>> 16. Then execute command:
>>>
>>> a.     mount /hdfs/client01
>>>
>>> b.      ls /hdfs/client01
>>>
>>> 17. References:
>>>
>>> a.     https://twiki.grid.iu.edu/bin/view/Storage/HadoopInstallation
>>>
>>> b.     http://wiki.apache.org/hadoop/MountableHDFS
>>>
>>> 18. Error:
>>>
>>> a.     After performing a "yum upgrade" on the entire system, it broke
>>> the fuse installation:
>>>
>>>        i.  [root@host1 ~]# modprobe fuse
>>> FATAL: Module fuse not found.
>>>
>>>        ii. [root@host1 ~]# mount /hdfs/client01
>>> port=11091,server=#A#####
>>> fuse-dfs didn't recognize /hdfs/client01,-2
>>> fuse-dfs ignoring option allow_other
>>> fuse: device not found, try 'modprobe fuse' first
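[Editor's note: the "Module fuse not found" failure after a "yum upgrade" typically means the fuse kernel module was built for the old kernel; the fuse-kmdl package reinstalled below encodes the matching kernel version in its name. A generic check, not specific to this thread's hosts:]

```shell
# After a kernel upgrade, fuse.ko must exist under the *running*
# kernel's module tree, or modprobe fails exactly as above.
RUNNING=$(uname -r)
echo "expect fuse.ko somewhere under /lib/modules/$RUNNING"
# As root, once the module is in place:
# modprobe fuse && lsmod | grep fuse
```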
>>>
>>> b.     Solution: Reinstall newer RPMs:
>>>
>>>        i.   fuse-2.7.4-8_10.el5.x86_64.rpm
>>>        ii.  fuse-kmdl-2.6.18-128.1.10.el5-2.7.4-8_10.el5.x86_64.rpm
>>>        iii. fuse-libs-2.7.4-8_10.el5.x86_64.rpm
>>>        iv.  hadoop-0.19.1-7.el5.x86_64.rpm
>>>        v.   hadoop-fuse-0.19.1-8.el5.x86_64.rpm
>>>
>>>
>>> 2009/12/13 lwm <lwm_zju@126.com>
>>>
>>>> Hi all,
>>>> I have installed Hadoop 0.18.2 and I want to use FUSE with it.
>>>> Following src/contrib/fuse-dfs/README, I executed "ant
>>>> compile-contrib -Dlibhdfs=1 -Dfusedfs=1", and an error occurred that
>>>> I can't fix. Can anyone help me, or point me to a good install guide?
>>>> Thanks.
>>>> When I executed "ant compile-contrib -Dcompile.c++=1 -Dfusedfs=1
>>>> -Dlibhdfs.noperms=1", it built successfully.
>>>>
>>>> compile:
>>>>    [echo] contrib: fuse-dfs
>>>>    [exec] automake: Makefile.am: required file `./NEWS' not found
>>>>    [exec] automake: Makefile.am: required file `./AUTHORS' not found
>>>>    [exec] automake: Makefile.am: required file `./ChangeLog' not found
>>>>    [exec] src/Makefile.am:19: invalid unused variable name: `AM_LDFLAGS'
>>>>    [exec] configure: error: cannot run /bin/bash ./config.sub
>>>> BUILD FAILED
>>>> /home/wm/work/hadoop/build.xml:410: The following error occurred while
>>>> executing this line:
>>>> /home/wm/work/hadoop/src/contrib/build.xml:30: The following error occurred
>>>> while executing this line:
>>>> /home/wm/work/hadoop/src/contrib/fuse-dfs/build.xml:54: exec returned: 1
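[Editor's note: the "required file `./NEWS' not found" messages come from automake's default (gnu) strictness. A common workaround, an assumption on my part rather than something this thread confirmed, is to create empty placeholder files before rerunning the build; the "cannot run ./config.sub" error matches the missing-execute-bit issue fixed by the "chmod 755" step mentioned earlier in the thread:]

```shell
# Empty placeholders are enough to satisfy automake's gnu strictness.
# Shown in a scratch directory; in practice run this in
# src/contrib/fuse-dfs before rerunning ant.
cd "$(mktemp -d)"
touch NEWS AUTHORS ChangeLog
ls NEWS AUTHORS ChangeLog
```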
>>>>
>>>>
>>>>
>>>
>>>
