hadoop-common-user mailing list archives

From "Raghavendra K" <raghavendr...@gmail.com>
Subject Re: Problem with LibHDFS
Date Tue, 26 Feb 2008 10:56:17 GMT
Hi,
  Thanks a lot for your reply.
I have added hadoop-core.jar and the conf/ directory to my CLASSPATH, but I am
still getting the same error.

test-libhdfs.sh: line 83:  8396 Segmentation fault      (core dumped)
CLASSPATH=$HADOOP_HOME/conf:$CLASSPATH
LD_PRELOAD="$HADOOP_HOME/libhdfs/libhdfs.so" $LIBHDFS_BUILD_DIR/$HDFS_TEST

What should I do?
Is there any way out?
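In case it helps: since libhdfs is a JNI wrapper, a common cause of this kind of segfault is the embedded JVM failing to find the Hadoop classes because the CLASSPATH only names conf/ and one jar. A minimal sketch of building a fuller CLASSPATH before running the test follows; the HADOOP_HOME default and the jar layout (jars at the top level and under lib/) are assumptions based on a 0.15.x-style install, not something confirmed in this thread:

```shell
#!/bin/sh
# Sketch only: put conf/ plus every jar under $HADOOP_HOME and
# $HADOOP_HOME/lib on the CLASSPATH seen by the embedded JVM.
# The HADOOP_HOME default below is an assumption; point it at your install.
HADOOP_HOME=${HADOOP_HOME:-$HOME/hadoop-0.15.3}

CLASSPATH=$HADOOP_HOME/conf
for jar in "$HADOOP_HOME"/*.jar "$HADOOP_HOME"/lib/*.jar; do
  # Skip the literal glob pattern when a directory has no jars.
  if [ -f "$jar" ]; then
    CLASSPATH=$CLASSPATH:$jar
  fi
done
export CLASSPATH

echo "$CLASSPATH"
```

After exporting CLASSPATH this way, run the hdfs_test binary as before (with LD_PRELOAD pointing at libhdfs.so); this mirrors what the 'test-libhdfs' target in build.xml does.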


On Fri, Feb 22, 2008 at 11:06 PM, Arun C Murthy <acm@yahoo-inc.com> wrote:

>
> On Feb 21, 2008, at 3:29 AM, Raghavendra K wrote:
>
> > Hi,
> >   I am able to get Hadoop running and can also compile libhdfs.
> > But when I run the hdfs_test program it gives a segmentation fault.
>
> Unfortunately the documentation for using libhdfs is sparse, our
> apologies.
>
> You'll need to set the CLASSPATH to include your hadoop-core.jar and
> the conf/ directory to run libhdfs since it is a JNI wrapper over the
> HDFS java api.
>
> Please take a look at the 'test-libhdfs' target in the top-level
> build.xml for hints on how to set them...
>
> Arun
>
> > Just a small program like this
> > #include "hdfs.h"
> > int main() {
> > return(0);
> > }
> > and compiled using the command
> > gcc -ggdb -m32 -I/garl/garl-alpha1/home1/raghu/Desktop/jre1.5.0_14/
> > include
> > -I/garl/garl-alpha1/home1/raghu/Desktop/jre1.5.0_14/include/
> > hdfs_test.c
> > -L/garl/garl-alpha1/home1/raghu/Desktop/hadoop-0.15.3/libhdfs -lhdfs
> > -L/garl/garl-alpha1/home1/raghu/Desktop/jre1.5.0_14/lib/i386/server
> > -ljvm
> > -shared -m32 -Wl,-x -o hdfs_test
> > Running hdfs_test gives a segmentation fault.
> > Please tell me how to fix it.
> >
> >
> >
> > --
> > Regards,
> > Raghavendra K
>
>


-- 
Regards,
Raghavendra K
