hadoop-general mailing list archives

From amit kumar verma <v.a...@verchaska.com>
Subject Re: jni files
Date Fri, 09 Jul 2010 09:09:04 GMT
Hi Hemant,

The versions are the same; I copied the same build to all the client machines.

I think I have a solution. As I read more about Hadoop and JNI, I learned 
that I need to copy the JNI files to 
HADOOP_INSTALLATION_DIR/lib/native/Linux-xxx-xxx. I thought my Linux 
machine was Linux-i386-32, but then I found that the 
"org.apache.hadoop.util.PlatformName" class tells you your machine type, 
and mine is Linux-amd64-64. As soon as I copied the JNI files into that 
directory, the errors stopped.
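
For reference, this is roughly how I checked it (a minimal sketch, 
assuming the 0.20-era API, where PlatformName exposes a static 
getPlatformName() method):

    // Prints the platform string Hadoop uses to pick the native lib dir,
    // e.g. "Linux-amd64-64" -> $HADOOP_HOME/lib/native/Linux-amd64-64
    import org.apache.hadoop.util.PlatformName;

    public class ShowPlatform {
        public static void main(String[] args) {
            System.out.println(PlatformName.getPlatformName());
        }
    }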

The full code is still not running, though, because I developed the 
application using the java.io.File class, and I am still thinking about 
how to change it so that it can access HDFS. Do I need to rewrite all my 
file-handling code against the Hadoop FileSystem API, or is there 
another way?
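
To make the question concrete, here is a minimal sketch of what I 
believe the rewrite involves, assuming the 0.20-era 
org.apache.hadoop.fs API (the path below is just a placeholder):

    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.fs.FSDataInputStream;
    import org.apache.hadoop.fs.FileSystem;
    import org.apache.hadoop.fs.Path;

    public class HdfsReadExample {
        public static void main(String[] args) throws Exception {
            Configuration conf = new Configuration();   // reads core-site.xml
            FileSystem fs = FileSystem.get(conf);       // HDFS if fs.default.name points there
            Path p = new Path("/libraries/input.txt");  // hypothetical path
            if (fs.exists(p)) {                         // analogue of File.exists()
                FSDataInputStream in = fs.open(p);      // analogue of new FileInputStream(...)
                try {
                    System.out.println("first byte: " + in.read());
                } finally {
                    in.close();
                }
            }
        }
    }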

It would be great if someone could advise on this.



Thanks,
Amit Kumar Verma
Verchaska Infotech Pvt. Ltd.



On 07/09/2010 02:04 PM, Hemanth Yamijala wrote:
> Hi,
>
> Possibly another silly question, but can you cross check if the
> versions of Hadoop on the client and the server are the same ?
>
> Thanks
> hemanth
>
> On Thu, Jul 8, 2010 at 10:57 PM, Allen Wittenauer
> <awittenauer@linkedin.com>  wrote:
>> On Jul 8, 2010, at 1:08 AM, amit kumar verma wrote:
>>
>>>      DistributedCache.addCacheFile("hdfs://*
>>>      /192.168.0.153:50075*/libraries/mylib.so.1#mylib.so", conf);
>> Do you actually have asterisks in this?  If so, that's the problem.
>>
>>
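
For the record, with the asterisks stripped out, I believe the call 
should look roughly like this. Note that in the 0.20 API 
DistributedCache.addCacheFile() takes a java.net.URI, not a String, and 
DistributedCache.createSymlink() is needed for the "#mylib.so" link to 
be created in each task's working directory:

    import java.net.URI;
    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.filecache.DistributedCache;

    public class CacheSetup {
        public static Configuration withNativeLib() throws Exception {
            Configuration conf = new Configuration();
            // The '#mylib.so' fragment names the symlink the task sees.
            DistributedCache.addCacheFile(
                new URI("hdfs://192.168.0.153:50075/libraries/mylib.so.1#mylib.so"),
                conf);
            DistributedCache.createSymlink(conf); // enable symlink creation
            return conf;
        }
    }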
