From: Huy Phan
Date: Wed, 28 Oct 2009 08:22:38 +0700
To: "Zhang Bingjun (Eddy)"
Cc: common-user@hadoop.apache.org, core-user@hadoop.apache.org, hdfs-dev@hadoop.apache.org, hdfs-user@hadoop.apache.org
Subject: Re: Mount WebDav in Linux for HDFS-0.20.1

Hi Zhang,

I applied my patch to davfs2-1.4.0 and it's working fine with Hadoop 0.20.1. If you didn't define any access restriction in the account.properties file, you can simply skip authentication when mounting with davfs2.
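For example, with no accounts configured you can just leave both prompts empty; the server address below is the one from your mail, and the mount point is only a placeholder:

  sudo mount.davfs http://192.168.0.131:9800 /mnt/hdfs-webdav
  Username:   (just hit Enter)
  Password:   (just hit Enter)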
Best,
Huy Phan

Zhang Bingjun (Eddy) wrote:
> Dear Huy Phan,
>
> I downloaded davfs2-1.4.3, and in this version the patch you sent me seems
> to be applied already. I compiled and installed this version. However, the
> error message is still around, like below...
>
> hadoop@hdfs2:/mnt$ sudo mount.davfs http://192.168.0.131:9800 hdfs-webdav/
> Please enter the username to authenticate with server
> http://192.168.0.131:9800 or hit enter for none.
> Username: hadoop
> Please enter the password to authenticate user hadoop with server
> http://192.168.0.131:9800 or hit enter for none.
> Password:
> mount.davfs: mounting failed; the server does not support WebDAV
>
> Which username or password should I input? Any user in the
> account.properties file, or a user in the OS running the WebDAV server?
>
> Regarding the memory leak in fuse-dfs and libhdfs, I posted one patch in
> the Apache JIRA. However, when used in a production environment, the
> memory leak still exists and makes the mount point unusable after a number
> of write/read operations. The memory leak there is really annoying...
>
> I hope I can set up the mix of davfs2 and WebDAV to try out its
> performance. Any ideas on how to get around the error "mounting failed;
> the server does not support WebDAV"?
>
> Thank you so much for your help!
>
> Best regards,
> Zhang Bingjun (Eddy)
>
> E-mail: eddymier@gmail.com, bingjun@nus.edu.sg, bingjun@comp.nus.edu.sg
> Tel No: +65-96188110 (M)
>
>
> On Tue, Oct 27, 2009 at 7:19 PM, Huy Phan wrote:
>
>     Hi Zhang,
>     I didn't play much with fuse-dfs. In my opinion, the memory leak is
>     something solvable, and I can see Apache has made some fixes for this
>     issue in libhdfs.
>     If you encountered these problems with an older version of Hadoop, I
>     think you should give the latest stable version a try.
>     Since I haven't had much fun with fuse-dfs so far, I cannot say whether
>     it's the best option or not, but it's definitely better than mixing
>     davfs2 and WebDAV together.
>
>     Best,
>     Huy Phan
>
>     Zhang Bingjun (Eddy) wrote:
>
>         Dear Huy Phan,
>
>         Thanks for your quick reply.
>         I was using fuse-dfs before, but I found a serious memory leak in
>         fuse-dfs: about 10MB leaked per 10k file reads/writes. When the
>         occupied memory reached about 150MB, the read/write performance
>         dropped dramatically. Did you encounter these problems?
>
>         What I am trying to do is to mount HDFS as a local directory in
>         Ubuntu. Do you think fuse-dfs is the best option so far?
>
>         Thank you so much for your input!
>
>         Best regards,
>         Zhang Bingjun (Eddy)
>
>         E-mail: eddymier@gmail.com, bingjun@nus.edu.sg, bingjun@comp.nus.edu.sg
>         Tel No: +65-96188110 (M)
>
>
>         On Tue, Oct 27, 2009 at 6:55 PM, Huy Phan wrote:
>
>             Hi Zhang,
>
>             Here is the patch for davfs2 to solve the "server does not
>             support WebDAV" issue:
>
>             diff --git a/src/webdav.c b/src/webdav.c
>             index 8ec7a2d..4bdaece 100644
>             --- a/src/webdav.c
>             +++ b/src/webdav.c
>             @@ -472,7 +472,7 @@ dav_init_connection(const char *path)
>
>                  if (!ret) {
>                      initialized = 1;
>             -        if (!caps.dav_class1 && !caps.dav_class2 && !ignore_dav_header) {
>             +        if (!caps.dav_class1 && !ignore_dav_header) {
>                          if (have_terminal) {
>                              error(EXIT_FAILURE, 0,
>                                    _("mounting failed; the server does not support WebDAV"));
>
>             davfs2 and WebDAV are not a good mix, actually; I had tried to
>             mix them together and the performance was really bad.
>             With a load test of 10 requests/s, the load average on my
>             namenode was always above 15, and it took me about 5 minutes to
>             `ls` the root directory of HDFS during the test.
>
>             Since you're using Hadoop 0.20.1, it's better to use the
>             fuse-dfs library provided in the Hadoop package. You have to do
>             some tricks to compile fuse-dfs against Hadoop, otherwise it
>             would take you a lot of time compiling redundant things.
>
>             Best,
>             Huy Phan
>
>             Zhang Bingjun (Eddy) wrote:
>
>                 Dear Huy Phan and others,
>
>                 Thanks a lot for your efforts in customizing the WebDAV
>                 server and making it work for Hadoop-0.20.1.
>                 After setting up the WebDAV server, I could access it using
>                 the cadaver client in Ubuntu without any username or
>                 password. Operations like deleting files, etc., were
>                 working. The command is: cadaver http://server:9800
>
>                 However, when I was trying to mount the WebDAV server using
>                 davfs2 in Ubuntu, I always got the following error:
>                 "mount.davfs: mounting failed; the server does not support
>                 WebDAV".
>
>                 I was prompted to input a username and password like below:
>                 hadoop@hdfs2:/mnt$ sudo mount.davfs http://192.168.0.131:9800/test hdfs-webdav/
>                 Please enter the username to authenticate with server
>                 http://192.168.0.131:9800/test or hit enter for none.
>                 Username: hadoop
>                 Please enter the password to authenticate user hadoop with
>                 server http://192.168.0.131:9800/test or hit enter for none.
>                 Password:
>                 mount.davfs: mounting failed; the server does not support WebDAV
>
>                 Even though I have tried all possible usernames and
>                 passwords, either from the WebDAV accounts.properties file
>                 or from the Ubuntu system of the WebDAV server, I still got
>                 this error message.
>                 Could you or anyone else give me some hints on this
>                 problem? How could I solve it? I very much appreciate your
>                 help!
>
>                 Best regards,
>                 Zhang Bingjun (Eddy)
>
>                 E-mail: eddymier@gmail.com, bingjun@nus.edu.sg, bingjun@comp.nus.edu.sg
>                 Tel No: +65-96188110 (M)
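For anyone following the thread, a rough sketch of the fuse-dfs mount Huy recommends above. It assumes the contrib/fuse-dfs build from the Hadoop 0.20 line; the build path, namenode host/port, and mount point are only placeholders, so check the README under src/contrib/fuse-dfs in your release:

  # after building libhdfs and the fuse-dfs contrib module
  cd $HADOOP_HOME/build/contrib/fuse-dfs
  ./fuse_dfs_wrapper.sh dfs://namenode:9000 /mnt/hdfs -d

As far as I recall, the wrapper script only sets up the CLASSPATH and the libhdfs/libjvm library paths before invoking the fuse_dfs binary, and -d is the standard FUSE debug option that keeps the mount in the foreground so you can watch for errors.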