hadoop-common-user mailing list archives

From amit kumar verma <v.a...@verchaska.com>
Subject Fwd: jni files
Date Thu, 08 Jul 2010 09:22:15 GMT


-------- Original Message --------
Subject: 	jni files
Date: 	Thu, 08 Jul 2010 13:38:26 +0530
From: 	amit kumar verma <v.amit@verchaska.com>
Reply-To: 	general@hadoop.apache.org
To: 	general@hadoop.apache.org



   Hi,

I developed a project that uses a native JNI library
(liblemur_jni.so). Previously I ran the application jar with
-Djava.library.path=/PATH_TO_JNI_FILES, but I am not able to do the
same with the ./hadoop jar command.
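(For context, the local -Djava.library.path flag can also be passed to the task JVMs through the mapred.child.java.opts property; a sketch of a conf/mapred-site.xml entry, assuming the same library path as above:)

```xml
<!-- Sketch only: forwards java.library.path to each map/reduce task JVM.
     The path is the one used locally above; adjust to the cluster layout. -->
<property>
  <name>mapred.child.java.opts</name>
  <value>-Djava.library.path=/PATH_TO_JNI_FILES</value>
</property>
```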

I followed
http://hadoop.apache.org/common/docs/r0.18.3/native_libraries.html

    1. First copy the library to the HDFS:
       bin/hadoop fs -copyFromLocal mylib.so.1 /libraries/mylib.so.1
    2. The job launching program should contain the following:
       DistributedCache.createSymlink(conf);
       DistributedCache.addCacheFile("hdfs://192.168.0.153:50075/libraries/mylib.so.1#mylib.so", conf);
    3. The map/reduce task can contain:
       System.loadLibrary("mylib.so");
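(Note that DistributedCache.addCacheFile actually takes a java.net.URI, and the fragment after "#" becomes the symlink name. A minimal pure-JDK sketch of how that URI is interpreted; the namenode address and port 9000 here are assumptions, the point being that the URI should carry the namenode RPC address from fs.default.name, not a datanode HTTP port such as 50075:)

```java
import java.net.URI;

public class CacheUriDemo {
    public static void main(String[] args) throws Exception {
        // Hypothetical namenode RPC address: fs.default.name usually points
        // at a port such as 9000 or 8020, not the datanode HTTP port 50075.
        URI cacheUri = new URI("hdfs://192.168.0.153:9000/libraries/mylib.so.1#mylib.so");

        // The path names the file in HDFS; the fragment after '#' is the
        // symlink name DistributedCache creates in the task's working
        // directory once createSymlink(conf) has been called.
        System.out.println(cacheUri.getPath());     // /libraries/mylib.so.1
        System.out.println(cacheUri.getFragment()); // mylib.so

        // In the job driver (Hadoop API, not compiled here):
        //   DistributedCache.createSymlink(conf);
        //   DistributedCache.addCacheFile(cacheUri, conf);
    }
}
```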

   but I am getting the following error:

Exception in thread "main" java.io.IOException: Call to
/192.168.0.153:50075 failed on local exception: java.io.EOFException
          at org.apache.hadoop.ipc.Client.wrapException(Client.java:775)
          at org.apache.hadoop.ipc.Client.call(Client.java:743)
          at org.apache.hadoop.ipc.RPC$Invoker.invoke(RPC.java:220)
          at $Proxy1.getProtocolVersion(Unknown Source)
          at org.apache.hadoop.ipc.RPC.getProxy(RPC.java:359)
          at org.apache.hadoop.hdfs.DFSClient.createRPCNamenode(DFSClient.java:106)
          at org.apache.hadoop.hdfs.DFSClient.<init>(DFSClient.java:207)
          at org.apache.hadoop.hdfs.DFSClient.<init>(DFSClient.java:170)
          at org.apache.hadoop.hdfs.DistributedFileSystem.initialize(DistributedFileSystem.java:82)
          at org.apache.hadoop.fs.FileSystem.createFileSystem(FileSystem.java:1378)
          at org.apache.hadoop.fs.FileSystem.access$200(FileSystem.java:66)
          at org.apache.hadoop.fs.FileSystem$Cache.get(FileSystem.java:1390)
          at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:196)
          at org.apache.hadoop.filecache.DistributedCache.getTimestamp(DistributedCache.java:506)
          at org.apache.hadoop.mapred.JobClient.configureCommandLineOptions(JobClient.java:640)
          at org.apache.hadoop.mapred.JobClient.submitJobInternal(JobClient.java:761)
          at org.apache.hadoop.mapreduce.Job.submit(Job.java:432)
          at org.apache.hadoop.mapreduce.Job.waitForCompletion(Job.java:447)
          at com.i4dweb.trobo.grid.WordCountNew.main(WordCountNew.java:49)
          at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
          at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
          at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
          at java.lang.reflect.Method.invoke(Method.java:597)
          at org.apache.hadoop.util.RunJar.main(RunJar.java:156)
Caused by: java.io.EOFException
          at java.io.DataInputStream.readInt(DataInputStream.java:375)
          at org.apache.hadoop.ipc.Client$Connection.receiveResponse(Client.java:508)
          at org.apache.hadoop.ipc.Client$Connection.run(Client.java:446)


Please advise.

-- 
Thanks,
Amit Kumar Verma
Verchaska Infotech Pvt. Ltd.




