From: amit kumar verma <v.amit@verchaska.com>
Date: Thu, 08 Jul 2010 13:38:26 +0530
To: general@hadoop.apache.org
Subject: jni files

Hi,

I developed a project that uses some native JNI libraries (liblemur_jni.so). Earlier I used to run the application jar with -Djava.library.path=/PATH_TO_JNI_FILES, but I am not able to do the same with the ./hadoop jar command.

I followed http://hadoop.apache.org/common/docs/r0.18.3/native_libraries.html:

1. First copy the library to HDFS:
   bin/hadoop fs -copyFromLocal mylib.so.1 /libraries/mylib.so.1

2. The job launching program should contain the following:
   DistributedCache.createSymlink(conf);
   DistributedCache.addCacheFile("hdfs://192.168.0.153:50075/libraries/mylib.so.1#mylib.so", conf);

3. The map/reduce task can contain:
   System.loadLibrary("mylib.so");
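For context, here is a minimal sketch of how I understand the three steps fit together in a driver and map task. The class names (NativeLibJob, MyMapper) are placeholders rather than my actual code, the cache URI is simply the one quoted above, and the map task loads the symlink by its absolute path, which is my own reading rather than something the documentation states:

   import java.io.File;
   import java.io.IOException;
   import java.net.URI;

   import org.apache.hadoop.conf.Configuration;
   import org.apache.hadoop.filecache.DistributedCache;
   import org.apache.hadoop.fs.Path;
   import org.apache.hadoop.io.IntWritable;
   import org.apache.hadoop.io.LongWritable;
   import org.apache.hadoop.io.Text;
   import org.apache.hadoop.mapreduce.Job;
   import org.apache.hadoop.mapreduce.Mapper;
   import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
   import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;

   // Placeholder driver/mapper, not the actual project code.
   public class NativeLibJob {

       public static class MyMapper
               extends Mapper<LongWritable, Text, Text, IntWritable> {
           static {
               // Step 3: the #mylib.so symlink appears in the task's working directory.
               // The docs quote System.loadLibrary("mylib.so"); loading the symlink by
               // its absolute path is an alternative (my assumption, not from the docs).
               System.load(new File("mylib.so").getAbsolutePath());
           }

           @Override
           protected void map(LongWritable key, Text value, Context context)
                   throws IOException, InterruptedException {
               // real map logic omitted
           }
       }

       public static void main(String[] args) throws Exception {
           Configuration conf = new Configuration();

           // Step 2: symlink the cached HDFS copy of the library into every task's
           // working directory as "mylib.so" (URI copied verbatim from the steps above).
           DistributedCache.createSymlink(conf);
           DistributedCache.addCacheFile(
                   new URI("hdfs://192.168.0.153:50075/libraries/mylib.so.1#mylib.so"), conf);

           Job job = new Job(conf, "native-lib-example");
           job.setJarByClass(NativeLibJob.class);
           job.setMapperClass(MyMapper.class);
           job.setOutputKeyClass(Text.class);
           job.setOutputValueClass(IntWritable.class);
           FileInputFormat.addInputPath(job, new Path(args[0]));
           FileOutputFormat.setOutputPath(job, new Path(args[1]));
           System.exit(job.waitForCompletion(true) ? 0 : 1);
       }
   }

Step 1 is just the bin/hadoop fs -copyFromLocal command shown above, run before submitting the job.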
But I am getting this error:

Exception in thread "main" java.io.IOException: Call to /192.168.0.153:50075 failed on local exception: java.io.EOFException
        at org.apache.hadoop.ipc.Client.wrapException(Client.java:775)
        at org.apache.hadoop.ipc.Client.call(Client.java:743)
        at org.apache.hadoop.ipc.RPC$Invoker.invoke(RPC.java:220)
        at $Proxy1.getProtocolVersion(Unknown Source)
        at org.apache.hadoop.ipc.RPC.getProxy(RPC.java:359)
        at org.apache.hadoop.hdfs.DFSClient.createRPCNamenode(DFSClient.java:106)
        at org.apache.hadoop.hdfs.DFSClient.<init>(DFSClient.java:207)
        at org.apache.hadoop.hdfs.DFSClient.<init>(DFSClient.java:170)
        at org.apache.hadoop.hdfs.DistributedFileSystem.initialize(DistributedFileSystem.java:82)
        at org.apache.hadoop.fs.FileSystem.createFileSystem(FileSystem.java:1378)
        at org.apache.hadoop.fs.FileSystem.access$200(FileSystem.java:66)
        at org.apache.hadoop.fs.FileSystem$Cache.get(FileSystem.java:1390)
        at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:196)
        at org.apache.hadoop.filecache.DistributedCache.getTimestamp(DistributedCache.java:506)
        at org.apache.hadoop.mapred.JobClient.configureCommandLineOptions(JobClient.java:640)
        at org.apache.hadoop.mapred.JobClient.submitJobInternal(JobClient.java:761)
        at org.apache.hadoop.mapreduce.Job.submit(Job.java:432)
        at org.apache.hadoop.mapreduce.Job.waitForCompletion(Job.java:447)
        at com.i4dweb.trobo.grid.WordCountNew.main(WordCountNew.java:49)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
        at java.lang.reflect.Method.invoke(Method.java:597)
        at org.apache.hadoop.util.RunJar.main(RunJar.java:156)
Caused by: java.io.EOFException
        at java.io.DataInputStream.readInt(DataInputStream.java:375)
        at org.apache.hadoop.ipc.Client$Connection.receiveResponse(Client.java:508)
        at org.apache.hadoop.ipc.Client$Connection.run(Client.java:446)

Please advise.

-- 
Thanks,
Amit Kumar Verma
Verchaska Infotech Pvt. Ltd.