hadoop-user mailing list archives

From Alberto Cordioli <cordioli.albe...@gmail.com>
Subject Re: MapReduce jobs from remote
Date Fri, 21 Sep 2012 07:30:15 GMT
Bertrand, I read about the plugin, but it seems to work only with
Hadoop 0.20.2, and I am currently working with Hadoop 1.0.3.

Harsh, it's not really clear to me why you say it's enough to have
the config files on the classpath. The jar is also required, right?
Otherwise I get a ClassNotFoundException. Am I missing something?
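
For context, here is roughly what my submission code looks like (a
minimal sketch: the jar path and HDFS paths are placeholders, and
IdentityMapper/IdentityReducer just stand in for my real classes).
Calling JobConf.setJar() explicitly is the only alternative I found to
putting the jar on the classpath; the cluster addresses come from the
core-site.xml/mapred-site.xml on the classpath:

import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.LongWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapred.FileInputFormat;
import org.apache.hadoop.mapred.FileOutputFormat;
import org.apache.hadoop.mapred.JobClient;
import org.apache.hadoop.mapred.JobConf;
import org.apache.hadoop.mapred.TextInputFormat;
import org.apache.hadoop.mapred.TextOutputFormat;
import org.apache.hadoop.mapred.lib.IdentityMapper;
import org.apache.hadoop.mapred.lib.IdentityReducer;

public class RemoteSubmitTest {
  public static void main(String[] args) throws Exception {
    // fs.default.name and mapred.job.tracker are read from the
    // core-site.xml / mapred-site.xml found on the classpath, so the
    // remote cluster's location is not hard-coded here.
    JobConf conf = new JobConf(RemoteSubmitTest.class);
    conf.setJobName("remote-submit-test");

    // Ship the job jar to the cluster explicitly. Without this (or the
    // jar on the classpath) the TaskTrackers cannot load the mapper
    // class and throw the ClassNotFoundException described above.
    conf.setJar("/local/path/to/myjob.jar"); // placeholder path

    conf.setMapperClass(IdentityMapper.class);   // stands in for my mapper
    conf.setReducerClass(IdentityReducer.class); // stands in for my reducer
    conf.setOutputKeyClass(LongWritable.class);
    conf.setOutputValueClass(Text.class);
    conf.setInputFormat(TextInputFormat.class);
    conf.setOutputFormat(TextOutputFormat.class);

    // Placeholder HDFS paths.
    FileInputFormat.setInputPaths(conf, new Path("/user/alberto/input"));
    FileOutputFormat.setOutputPath(conf, new Path("/user/alberto/output"));

    JobClient.runJob(conf); // blocks until the job completes
  }
}

If I drop both the setJar() call and the jar from the classpath, this
is exactly where the ClassNotFoundException appears.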

Thank you very much,
Alberto


On 20 September 2012 17:18, Harsh J <harsh@cloudera.com> wrote:
> You have already found the simplest way of doing this, I think. The
> other way may be to use Oozie if your job jars don't change much and
> can be staged directly into HDFS for ready submit-when-required.
>
> However, if you want to run from Eclipse, you just need the config
> file resources on the classpath of the run configuration of your job
> project. That should work well enough.
>
> On Thu, Sep 20, 2012 at 7:44 PM, Alberto Cordioli
> <cordioli.alberto@gmail.com> wrote:
>> Hi all,
>>
>> I'd now like to deploy a simple MapReduce job, written in Java, to a
>> remote cluster from within Eclipse.
>> For the moment I've found this solution:
>>
>> 1) Put the Hadoop conf files on the classpath.
>> 2) Put the jar containing the job on the classpath.
>> 3) Run.
>>
>> If I don't put the jar on the classpath, running the job throws a
>> ClassNotFoundException:
>> java.lang.RuntimeException: java.lang.ClassNotFoundException: <MyMapperClass>
>>
>> I was wondering if there is a simpler way to do this.
>>
>>
>> Thank you very much,
>> Alberto
>
>
>
> --
> Harsh J



-- 
Alberto Cordioli
