hadoop-hdfs-user mailing list archives

From Alexander Lorenz <wget.n...@googlemail.com>
Subject Re: Mounting HDFS
Date Wed, 04 Jan 2012 15:06:47 GMT
Hi Stuti,

Do a search for libhdfs.so* and also run an ldd on /path/to/fuse_dfs. It could be that only a symlink
is missing. With ldd you will see which libraries the binary wants; if the libhdfs.so it needs is
not on the library path, export the path where you found it.
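
For example, something along these lines (paths are illustrative, adjust to wherever your build put libhdfs):

# locate the library the binary complains about
find / -name 'libhdfs.so*' 2>/dev/null
# see which shared libraries fuse_dfs wants and which ones are missing
ldd /sbin/fuse_dfs
# if only the versioned name is missing, a symlink in that directory is enough
ln -s /path/to/libhdfs.so /path/to/libhdfs.so.0
# otherwise make the directory visible to the dynamic loader
export LD_LIBRARY_PATH=$LD_LIBRARY_PATH:/path/to/dir/containing/libhdfs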

- Alex

Alexander Lorenz
http://mapredit.blogspot.com

On Jan 4, 2012, at 4:08 AM, Stuti Awasthi <stutiawasthi@hcl.com> wrote:

> I have already exported it in the env. Output of the "export" command:
> 
> declare -x LD_LIBRARY_PATH="/usr/lib:/usr/local/lib:/home/jony/FreshHadoop/hadoop-0.20.2/build/libhdfs:/usr/lib/jvm/java-6-openjdk/jre/lib/i386/server/:/usr/lib/libfuse.so"
> 
> Stuti
> ________________________________________
> From: Harsh J [harsh@cloudera.com]
> Sent: Wednesday, January 04, 2012 5:34 PM
> To: hdfs-user@hadoop.apache.org
> Subject: Re: Mounting HDFS
> 
> Stuti,
> 
> Your env needs to carry this:
> 
> export LD_LIBRARY_PATH=$LD_LIBRARY_PATH:/path/to/dir/where/libhdfs/files/are/present
> 
> Otherwise the fuse_dfs binary won't be able to find and load it. The
> wrapper script does this as part of its setup if you read it.
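> 
> For instance, with a 0.20.2 source build like yours the exports end up looking roughly like this
> (paths are illustrative; adjust HADOOP_HOME and the JVM directory to your machine):
> 
> export JAVA_HOME=/usr/lib/jvm/java-6-openjdk
> export HADOOP_HOME=/path/to/hadoop-0.20.2
> export LD_LIBRARY_PATH=$LD_LIBRARY_PATH:$HADOOP_HOME/build/libhdfs:$JAVA_HOME/jre/lib/i386/server/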
> 
> On Wed, Jan 4, 2012 at 5:29 PM, Stuti Awasthi <stutiawasthi@hcl.com> wrote:
>> I'm able to mount using the command:
>> 
>> fuse_dfs_wrapper.sh dfs://<server>:<port> /export/hdfs
>> 
>> -----Original Message-----
>> From: Stuti Awasthi
>> Sent: Wednesday, January 04, 2012 5:24 PM
>> To: hdfs-user@hadoop.apache.org
>> Subject: RE: Mounting HDFS
>> 
>> Harsh,
>> 
>> Output of $ file `which fuse_dfs`:
>> 
>> /sbin/fuse_dfs: ELF 32-bit LSB executable, Intel 80386, version 1 (SYSV), dynamically linked (uses shared libs), for GNU/Linux 2.6.15, not stripped
>> 
>> Same output for $ file /sbin/fuse_dfs
>> 
>> Thanks
>> ________________________________________
>> From: Harsh J [harsh@cloudera.com]
>> Sent: Wednesday, January 04, 2012 5:18 PM
>> To: hdfs-user@hadoop.apache.org
>> Subject: Re: Mounting HDFS
>> 
>> Stuti,
>> 
>> My original command was "file `which fuse_dfs`", and not just the which command.
>> 
>> Can you run "file /sbin/fuse_dfs"? You need the utility called 'file' available (it's usually present).
>> 
>> On 04-Jan-2012, at 5:08 PM, Stuti Awasthi wrote:
>> 
>>> Hi Harsh,
>>> 
>>> Currently I am using 32-bit Ubuntu 11.10 and Hadoop 0.20.2.
>>> 
>>> Output of : $ which fuse_dfs
>>> /sbin/fuse_dfs
>>> 
>>> I searched on the net and found this URL: "http://wiki.apache.org/hadoop/MountableHDFS"
>>> How can I get the HDFS FUSE deb or rpm packages? Thanks for pointing this out; can you please guide me further on this?
>>> 
>>> Thanks
>>> 
>>> -----Original Message-----
>>> From: Harsh J [mailto:harsh@cloudera.com]
>>> Sent: Wednesday, January 04, 2012 4:51 PM
>>> To: hdfs-user@hadoop.apache.org
>>> Subject: Re: Mounting HDFS
>>> 
>>> Stuti,
>>> 
>>> What's your platform - 32-bit or 64-bit? Which one have you built libhdfs for?
>>> 
>>> What's the output of the following?
>>> $ file `which fuse_dfs`
>>> 
>>> FWIW, the most hassle-free way to do these things today is to use proper packages available for your platform, instead of compiling them yourself. Just a suggestion.
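>>> 
>>> For instance, with a vendor repository configured it is usually a one-liner, something like the
>>> following (the package name here is only an example and differs per distro and version):
>>> 
>>> sudo apt-get install hadoop-0.20-fuse    # Debian/Ubuntu
>>> sudo yum install hadoop-0.20-fuse        # RHEL/CentOS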
>>> 
>>> On 04-Jan-2012, at 4:28 PM, Stuti Awasthi wrote:
>>> 
>>>> Hi All,
>>>> 
>>>> I am following http://wiki.apache.org/hadoop/MountableHDFS for HDFS mount.
>>>> I have successfully followed the steps till "Installing" and I am able to mount it properly. After that I am trying the "Deploying" step and followed these steps:
>>>> 
>>>> 1. add the following to /etc/fstab
>>>> fuse_dfs#dfs://hadoop_server.foo.com:9000 /export/hdfs fuse -oallow_other,rw,-ousetrash 0 0
>>>> 
>>>> 2. added fuse_dfs to /sbin
>>>> 
>>>> 3. export LD_LIBRARY_PATH=$LD_LIBRARY_PATH:$HADOOP_HOME/build/libhdfs
>>>> 
>>>> 4. Mount using: mount /export/hdfs.
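>>>> 
>>>> Putting steps 1-4 together, the sequence is roughly the following (server, port and paths as
>>>> above; creating the mount point is implied):
>>>> 
>>>> # 1) /etc/fstab entry (one line):
>>>> fuse_dfs#dfs://hadoop_server.foo.com:9000 /export/hdfs fuse -oallow_other,rw,-ousetrash 0 0
>>>> # 2) binary in /sbin, 3) library path, then 4) mount:
>>>> cp fuse_dfs /sbin
>>>> export LD_LIBRARY_PATH=$LD_LIBRARY_PATH:$HADOOP_HOME/build/libhdfs
>>>> mkdir -p /export/hdfs
>>>> mount /export/hdfs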
>>>> 
>>>> But getting error :
>>>> fuse_dfs: error while loading shared libraries: libhdfs.so.0: cannot open shared object file: No such file or directory.
>>>> 
>>>> How do I fix this?
>>>> 
>>>> Thanks
>>>> 
>>> 
>> 
> 
> 
> 
> --
> Harsh J
