hadoop-hdfs-dev mailing list archives

From Aastha Mehta <aasth...@gmail.com>
Subject Compiling hadoop native libraries
Date Mon, 01 Aug 2011 04:04:42 GMT

I am trying to run fuse_dfs_wrapper.sh from
hadoop-0.20.2/src/contrib/fuse_dfs/src on a 64-bit machine. I get the
following error:
./fuse_dfs: error while loading shared libraries: libhdfs.so.0: cannot open
shared object file: No such file or directory
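For context, this loader error usually means the directory containing libhdfs.so.0 is not on the dynamic linker's search path when fuse_dfs starts. A minimal workaround sketch (the build/libhdfs location is an assumption; adjust it to wherever libhdfs.so.0 actually lives in your tree):

```shell
# Hedged workaround sketch: put the directory holding libhdfs.so.0 on the
# linker's search path before launching the wrapper. The path below is an
# assumption for a hadoop-0.20.2 source build; verify with:
#   find "$HADOOP_HOME" -name 'libhdfs.so*'
export HADOOP_HOME=/path/to/hadoop-0.20.2
export LD_LIBRARY_PATH="$HADOOP_HOME/build/libhdfs:$LD_LIBRARY_PATH"
./fuse_dfs_wrapper.sh
```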

I searched on the net and found a response to a similar query here -

My hadoop package contains the native files in

I followed this link -
http://hadoop.apache.org/common/docs/current/native_libraries.html to
understand the steps to build the hadoop native libraries.

I have a small query regarding the building step. On the above link, it is
mentioned -

"Once you installed the prerequisite packages use the standard hadoop
build.xml file and pass along the compile.native flag (set to true) to build
the native hadoop library:

$ ant -Dcompile.native=true <target>

You should see the newly-built library in:

$ build/native/<platform>/lib

where <platform> is a combination of the system-properties: ${os.name
}-${os.arch}-${sun.arch.data.model} (for example, Linux-i386-32)."
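The quoted step can be sketched as follows (the target names here are assumptions, not taken from the docs; `ant -p` run from the Hadoop source root lists the targets that build.xml actually defines):

```shell
# Run from the top of the Hadoop source tree.
# List the targets build.xml actually defines (names vary by release):
ant -p

# Hedged sketch: substitute one of the listed targets for <target>,
# e.g. a compile target, with the native flag enabled:
ant -Dcompile.native=true compile-native

# On success the library should appear under build/native/<platform>/lib,
# e.g. on a 64-bit Linux x86 machine:
ls build/native/Linux-amd64-64/lib
```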

Could someone please tell me what exactly <target> is in the first step?

Thanks and regards,


Aastha Mehta
B.E. (Hons.) Computer Science
BITS Pilani
E-mail: aasthakm@gmail.com
