hadoop-user mailing list archives

From fateme Abiri <fateme.ab...@yahoo.com>
Subject submit job to remote hadoop cluster
Date Sun, 03 Nov 2013 14:57:47 GMT

I want to run a MapReduce job from an IDE (NetBeans) as a client. My Hadoop cluster is on a different machine, so I set the following in my job configuration to run the job remotely:

 Configuration config = new Configuration();
 config.set("fs.default.name", "hdfs://");   // NameNode URI (host elided here)
 config.set("mapred.job.tracker", "");       // JobTracker address (elided here)
 Job job = new Job(config, "myJob");

My Mapper class and my Reducer class are both inner classes of MyMapReducClass.

Now I want to run my code directly from the IDE (NetBeans), without using the command:
$ hadoop jar ….
Instead, I call ToolRunner.run(new MyMapReducClass(), args) in my code.
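To be concrete, my driver is structured roughly like the skeleton below (the class name MyMapReducClass is real, but the commented-out Mapper/Reducer details and job wiring are only illustrative of my setup, not my exact code):

```java
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.conf.Configured;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.util.Tool;
import org.apache.hadoop.util.ToolRunner;

// Illustrative skeleton of my driver; Mapper and Reducer are inner classes.
public class MyMapReducClass extends Configured implements Tool {

    // public static class MyMapper extends Mapper<...> { ... }
    // public static class MyReducer extends Reducer<...> { ... }

    @Override
    public int run(String[] args) throws Exception {
        Configuration config = getConf();
        config.set("fs.default.name", "hdfs://");   // NameNode URI (host elided)
        config.set("mapred.job.tracker", "");       // JobTracker address (elided)

        Job job = new Job(config, "myJob");
        job.setJarByClass(MyMapReducClass.class);   // should locate the jar containing this class
        // job.setMapperClass(...); job.setReducerClass(...); input/output paths ...
        return job.waitForCompletion(true) ? 0 : 1;
    }

    public static void main(String[] args) throws Exception {
        System.exit(ToolRunner.run(new MyMapReducClass(), args));
    }
}
```

Running this requires a reachable cluster, so it is only a sketch of the shape of my program.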

But when I run the code I get this error:

 attempt_201311031101_0008_m_000008_0, Status : FAILED
java.lang.RuntimeException: java.lang.ClassNotFoundException:

As I said, I already call job.setJarByClass(MyMapReducClass.class), and the Mapper class is in MyMapReducClass!

If it is necessary to set the Hadoop classpath, is it possible to set it from my IDE (NetBeans)? For security reasons I don't have permission to ssh into the Hadoop cluster to set this parameter for every program execution.

In general:

What prerequisites are needed to run a MapReduce job from an IDE (NetBeans) on my local machine against a remote cluster? Should I copy the jar file of my classes and libraries to the Hadoop cluster? I have them only on my local machine and have not copied them to the cluster.

Please tell me what I can do.

– Thanks so much