From: alo alt <wget.null@googlemail.com>
To: hdfs-user@hadoop.apache.org
Date: Thu, 8 Dec 2011 12:05:14 +0100
Subject: Re: Best option for mounting HDFS

Hi Stuti,

I've set up Lucid LTS to figure out our build problem earlier today, so I can also help you there ;)

Install the repo:

#> wget http://archive.cloudera.com/one-click-install/lucid/cdh3-repository_1.0_all.deb && dpkg -i cdh3-repository_1.0_all.deb
#> apt-get update
#> apt-cache search hadoop

That prints a list of all available Hadoop packages.

#> apt-get install hadoop-0.20-fuse

That installs the fuse package.
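Once the package is in place, the mount itself is just the fuse helper pointed at your NameNode. A minimal sketch (the NameNode host, port, and the /mnt/hdfs mount point below are placeholders; adjust them for your cluster):

#> mkdir -p /mnt/hdfs
#> hadoop-fuse-dfs dfs://<namenode-host>:<namenode-port> /mnt/hdfs

After that the HDFS namespace shows up under /mnt/hdfs like a regular directory tree, and umount /mnt/hdfs detaches it again.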
Please give it a try.

- alex

On Thu, Dec 8, 2011 at 11:09 AM, Stuti Awasthi <stutiawasthi@hcl.com> wrote:
> Hi Alexander,
>
> I want to try CDH for the HDFS mount option. I have an Ubuntu machine and
> am following https://ccp.cloudera.com/display/CDHDOC/Mountable+HDFS
> But when I do "sudo apt-get install hadoop-0.20-fuse", I get a "no package
> found" error.
> I already have Hadoop running, version 0.20.2.
>
> Are there any more steps before I can download the hadoop-0.20-fuse package?
>
> Thanks
>
> From: Alexander C.H. Lorenz [mailto:wget.null@googlemail.com]
> Sent: Wednesday, November 30, 2011 2:34 PM
> To: hdfs-user@hadoop.apache.org
> Subject: Re: Best option for mounting HDFS
>
> Hi,
>
> I wrote a small article about this; it works in some installations I manage:
> http://mapredit.blogspot.com/2011/11/nfs-exported-hdfs-cdh3.html
>
> I would suggest using NFSv4, if available in your distro.
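> The export side of that setup is essentially just sharing the fuse-mounted
> HDFS path over NFS. A rough sketch, with a placeholder mount path and client
> network (see the article for the actual setup):
>
> #> echo '/mnt/hdfs 192.168.0.0/24(rw,fsid=0,sync,no_subtree_check)' >> /etc/exports
> #> exportfs -ra
>
> Clients can then mount it with a plain NFSv4 mount against the exporting host.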
> On Wed, Nov 30, 2011 at 6:10 AM, Stuti Awasthi <stutiawasthi@hcl.com> wrote:
> Hey Joey,
> Thanks for the update :). I will try both as you have suggested.
>
> -----Original Message-----
> From: Joey Echeverria [mailto:joey@cloudera.com]
> Sent: Tuesday, November 29, 2011 7:25 PM
> To: hdfs-user@hadoop.apache.org
> Subject: Re: Best option for mounting HDFS
>
> Hey Stuti,
>
> Fuse is probably the most commonly used solution. It has some limitations
> because HDFS isn't POSIX compliant, but it works for a lot of use cases.
> You can try out both the contrib driver and the Google Code version. I'm
> not sure which will work better for your Hadoop version. Newer Hadoop
> releases have a lot of fuse-related improvements.
>
> -Joey
>
> On Tue, Nov 29, 2011 at 2:44 AM, Alexander C.H. Lorenz
> <wget.null@googlemail.com> wrote:
> > Hi Stuti,
> > I have good experience with FUSE (http://fuse.sourceforge.net/), but
> > that's not a recommendation.
> > - alex
> >
> > On Tue, Nov 29, 2011 at 5:38 AM, Stuti Awasthi <stutiawasthi@hcl.com> wrote:
> >>
> >> Hi Friends,
> >> Any thoughts on this?
> >>
> >> -----Original Message-----
> >> From: Stuti Awasthi
> >> Sent: Monday, November 28, 2011 2:51 PM
> >> To: hdfs-user@hadoop.apache.org
> >> Subject: Best option for mounting HDFS
> >>
> >> Hi all,
> >>
> >> I was looking at the options available for mounting HDFS on Unix
> >> boxes. I found the following options on the wiki page:
> >>
> >> * contrib/fuse-dfs is built on fuse, some C glue, libhdfs and the
> >> hadoop-dev.jar
> >>
> >> * fuse-j-hdfs is built on fuse, fuse for java, and the hadoop-dev.jar
> >>
> >> * hdfs-fuse - a Google Code project very similar to contrib/fuse-dfs
> >>
> >> * webdav - HDFS exposed as a WebDAV resource
> >>
> >> * mapR - contains a closed-source, HDFS-compatible file system that
> >> supports read/write NFS access
> >>
> >> I tried WebDAV, but ran into problems integrating it with LDAP. In my
> >> scenario I want to mount HDFS and apply LDAP authentication over that
> >> mount point.
> >> I would like to know which of the above will work best for this scenario.
> >> Please suggest.
> >>
> >> Stuti
> >
> > --
> > Alexander Lorenz
> > http://mapredit.blogspot.com
>
> --
> Joseph Echeverria
> Cloudera, Inc.
> 443.305.9434
>
> --
> Alexander Lorenz
> http://mapredit.blogspot.com

--
Alexander Lorenz
http://mapredit.blogspot.com

Think of the environment: please don't print this email unless you really need to.

I've set up Lucy LTS to figure out our bui= ld-problem earlier today, so I can also help you there ;)

install the repo:
#> apt-get update
#>apt-cache search hadoop
=
print me out a list of all packages.

#>=C2=A0apt-get install hadoop-0.20-fuse

inst= all it.

please give a try.

- alex


On Thu, Dec 8, 2011 at = 11:09 AM, Stuti Awasthi <stutiawasthi@hcl.com> wrote:
Hi Alexander,

I want to try with CDH for mount option of HDFS. I have Ubuntu machine. I a= m following : https://ccp.cloudera.com/display/CDHDOC/Mountable+H= DFS
But when I do =C2=A0=E2=80=9Csudo apt-get install hadoop-0.20-fuse=E2=80=9D= , I am getting error of no package found.
I already have Hadoop running version Hadoop 0.20.2

Are there any more steps before I try to download Hadoop-0.20-fuse package.=

Thanks

From: Alexander C.H. Lorenz [mailto:wget.null@googlemail.com]
Sent: Wednesday, November 30, 2011 2:34 PM
To: h= dfs-user@hadoop.apache.org
Subject: Re: Best option for mounting HDFS

Hi,

I wrote up a small article about, that works in some installations I manage= d.
http://mapredit.blogspot.com/2011/11/nfs-exported-hdfs-= cdh3.html

I would suggest to use NFS4, if available in your distro.
On Wed, Nov 30, 2011 at 6:10 AM, Stuti Awasthi <stutiawasthi@hcl.com> wrote:
Hey Joey,
Thanks for update :). I will try both as you have suggested .

-----Original Message-----
From: Joey Echeverria [mailto:joey@clo= udera.com]
Sent: Tuesday, November 29, 2011 7:25 PM
To: hdfs-user@hadoop.apache.= org
Subject: Re: Best option for mounting HDFS

Hey Stuti,

Fuse is probably the most commonly used solution. It has some limitations b= ecause HDFS isn't posix compliant, but it it works for a lot of use cas= es. You can try out both the contrib driver and the google code version. I&= #39;m not sure which will work better for your Hadoop version. Newer Hadoop= releases have a lot of fuse related improvements.

-Joey

On Tue, Nov 29, 2011 at 2:44 AM, Alexander C.H. Lorenz <wget.null@googlemail.com> wrote:
> Hi Stuti,
> I have good experience with FUSE (http://fuse.sourceforge.net/), but
> thats not a recommandation.
> - alex
>
> On Tue, Nov 29, 2011 at 5:38 AM, Stuti Awasthi <stutiawasthi@hcl.com> wrote:
>>
>> Hi Friends ,
>> Any thoughts on this ??
>>
>> -----Original Message-----
>> From: Stuti Awasthi
>> Sent: Monday, November 28, 2011 2:51 PM
>> To: hdfs-user@hadoo= p.apache.org
>> Subject: Best option for mounting HDFS
>>
>> Hi all,
>>
>> I was looking at various options available to mount HDFS on unix >> boxes. I found following option on wiki page.
>> * contrib/fuse-dfs is built on fuse, some C glue, libhdfs and the<= br> >> hadoop-dev.jar
>>
>> * fuse-j-hdfs is built on fuse, fuse for java, and the hadoop-dev.= jar
>>
>> * hdfs-fuse - a google code project is very similar to
>> contrib/fuse-dfs
>>
>> * webdav - hdfs exposed as a webdav resource
>>
>> * mapR - contains a closed source hdfs compatible file system that=
>> supports read/write NFS access
>>
>> I tried webdav but then its integration problem with LDAP. In my >> scenario I wanted to mount HDFS and apply LDAP authentication over= that mount point.
>> I wanted to know out of the above , which will work best for this<= br> >> scenario.
>> Please Suggest
>>
>> Stuti
>>
>> ::DISCLAIMER::
>>
>> ------------------------------------------------------------------= ---
>> --------------------------------------------------
>>
>> The contents of this e-mail and any attachment(s) are confidential=
>> and intended for the named recipient(s) only.
>> It shall not attach any liability on the originator or HCL or its<= br> >> affiliates. Any views or opinions presented in this email are sole= ly
>> those of the author and may not necessarily reflect the opinions o= f
>> HCL or its affiliates.
>> Any form of reproduction, dissemination, copying, disclosure,
>> modification, distribution and / or publication of this message >> without the prior written consent of the author of this e-mail is<= br> >> strictly prohibited. If you have received this email in error plea= se
>> delete it and notify the sender immediately. Before opening any ma= il
>> and attachments please check them for viruses and defect.
>>
>>
>> ------------------------------------------------------------------= ---
>> --------------------------------------------------
>
>
>
> --
> Alexander Lorenz
> http://mapr= edit.blogspot.com
> P=C2=A0Think of the environment: please don't print this email unl= ess you
> really need to.
>
>



--
Joseph Echeverria
Cloudera, Inc.
443.305.9434




--
Alexander Lorenz
http://mapredit.= blogspot.com

=EF=81=90=EF=82=A0Think of the environment: please don't pr= int this email unless you really need to.





--
Alexand= er Lorenz

P=C2=A0Think of the environment: please don't print this email u= nless you really need to.


--14dae9c097e49745be04b392a4c8--