From: Stuti Awasthi <stutiawasthi@hcl.com>
To: hdfs-user@hadoop.apache.org
Date: Fri, 9 Dec 2011 22:13:24 +0530
Subject: RE: 0.23 build fuse-dfs contrib


Thanks Petru for sharing this :)


From: Petru Dimulescu [mailto:petru.dimulescu@gmail.com]
Sent: Friday, December 09, 2011 9:35 PM
To: hdfs-user@hadoop.apache.org
Subject: 0.23 build fuse-dfs contrib

 

Hello, this mail originated as a question; in the meantime I found the solution, so hopefully it helps someone.

If you want to build fuse-dfs on the 0.23 snapshot branch on Ubuntu Linux 11.10 (the version matters, as you'll see):
First you need to do a mvn package -Pnative in hadoop-common/hadoop-hdfs-project/hadoop-hdfs. You'll get a target/native/ subdir; go there and do a make install so that you'll have libhdfs.so in a system libdir.
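
For reference, the whole sequence is roughly this (exact paths depend on where your checkout lives, and the install step may or may not need sudo):

$ cd hadoop-common/hadoop-hdfs-project/hadoop-hdfs
$ mvn package -Pnative
$ cd target/native
$ sudo make install    # puts libhdfs.so into a system libdir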

Then I went to hadoop-hdfs/src/contrib and typed:

$ ant compile -Dfusedfs=1

It complained about not having hadoop-common/hadoop-hdfs-project/hadoop-hdfs/ivy/libraries.properties -- that (empty) file actually lives at hadoop-common/hadoop-hdfs-project/hadoop-hdfs/src/contrib/fuse-dfs/ivy/libraries.properties, so I copied it to the expected place.
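
Something like this, from the hadoop-hdfs directory (exact invocation reconstructed, so double-check the paths):

$ cp src/contrib/fuse-dfs/ivy/libraries.properties ivy/

With the file in place, I got: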

   [exec] In file included from fuse_impls.h:26:0,
   [exec]                  from fuse_dfs.c:21:
   [exec] fuse_context_handle.h:22:18: fatal error: hdfs.h: No such file or directory

I copied hdfs.h from the src/main/native dir to /usr/local/include, just to make it happy quickly.
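
In other words (a quick hack -- pointing the build at the header with the right -I flag would be cleaner):

$ sudo cp src/main/native/hdfs.h /usr/local/include/

Next error: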

   [exec] fuse_impls_write.c: In function 'dfs_write':
   [exec] fuse_impls_write.c:38:16: warning: cast to pointer from integer of different size [-Wint-to-pointer-cast]
   [exec] fuse_dfs.o: In function `is_protected':
   [exec] /home/petru/work/ubeeko/hadoo.apache.org/0.23/hadoop-common/hadoop-hdfs-project/hadoop-hdfs/src/contrib/fuse-dfs/src/fuse_dfs.c:27: undefined reference to `fuse_get_context'

This is because of https://bugs.launchpad.net/ubuntu/+source/fuse/+bug/878612 : Ubuntu 11.10's toolchain passes --as-needed to the linker by default, so the order of -l flags matters -- a library named before the object files that reference it gets dropped, hence the undefined reference.
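
A toy example of the same rule (nothing to do with fuse, just to show the effect):

$ cat > hello.c <<'EOF'
#include <math.h>
#include <stdio.h>
int main(int argc, char **argv) { printf("%f\n", sqrt((double)argc)); return 0; }
EOF
$ gcc -c hello.c
$ gcc -lm hello.o -o hello    # fails under --as-needed: undefined reference to `sqrt'
$ gcc hello.o -o hello -lm    # works: libraries after the objects that need them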

So just edit the failing link command, which looks something like this:

gcc -Wall -g -Wall -O3 -L/home/petru/work/ubeeko/hadoo.apache.org/0.23/hadoop-common/hadoop-hdfs-project/hadoop-hdfs/build/c++/Linux-i386-32/lib -lhdfs -L/lib -lfuse -L/usr/local/java/jdk/jre/lib/i386/server -ljvm -o fuse_dfs fuse_dfs.o fuse_options.o fuse_trash.o fuse_stat_struct.o fuse_users.o fuse_init.o fuse_connect.o fuse_impls_access.o fuse_impls_chmod.o fuse_impls_chown.o fuse_impls_create.o fuse_impls_flush.o fuse_impls_getattr.o fuse_impls_mkdir.o fuse_impls_mknod.o fuse_impls_open.o fuse_impls_read.o fuse_impls_release.o fuse_impls_readdir.o fuse_impls_rename.o fuse_impls_rmdir.o fuse_impls_statfs.o fuse_impls_symlink.o fuse_impls_truncate.o fuse_impls_utimens.o fuse_impls_unlink.o fuse_impls_write.o

by moving all the -L and -l flags to the end, then:

$ cd src/
$ gcc -Wall -g -Wall -O3 -o fuse_dfs fuse_dfs.o fuse_options.o fuse_trash.o fuse_stat_struct.o fuse_users.o fuse_init.o fuse_connect.o fuse_impls_access.o fuse_impls_chmod.o fuse_impls_chown.o fuse_impls_create.o fuse_impls_flush.o fuse_impls_getattr.o fuse_impls_mkdir.o fuse_impls_mknod.o fuse_impls_open.o fuse_impls_read.o fuse_impls_release.o fuse_impls_readdir.o fuse_impls_rename.o fuse_impls_rmdir.o fuse_impls_statfs.o fuse_impls_symlink.o fuse_impls_truncate.o fuse_impls_utimens.o fuse_impls_unlink.o fuse_impls_write.o -L/home/petru/work/ubeeko/hadoo.apache.org/0.23/hadoop-common/hadoop-hdfs-project/hadoop-hdfs/build/c++/Linux-i386-32/lib -lhdfs -L/lib -lfuse -L/usr/local/java/jdk/jre/lib/i386/server -ljvm

There, hope that helps someone. Don't you just love autoconf?
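
P.S. Once fuse_dfs is built, mounting goes roughly like this -- I'm quoting the fuse-dfs wrapper usage from memory, so treat the exact syntax as an assumption and check the README:

$ mkdir -p /mnt/hdfs
$ ./fuse_dfs_wrapper.sh dfs://your-namenode:9000 /mnt/hdfs -d

where dfs://your-namenode:9000 matches your fs.default.name and -d keeps it in the foreground for debugging.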


