hadoop-common-user mailing list archives

From Harsh J <ha...@cloudera.com>
Subject Re: MapReduce jobs from remote
Date Thu, 20 Sep 2012 15:18:35 GMT
You have already found the simplest way of doing this, I think. The
other option may be to use Oozie, if your job jars don't change much
and can be staged directly into HDFS, ready to submit whenever required.
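
For illustration, a minimal sketch of kicking off a pre-staged workflow
through the Oozie Java client API; the server URL, HDFS app path and
user name here are all made up:

    import java.util.Properties;
    import org.apache.oozie.client.OozieClient;

    public class OozieSubmit {
        public static void main(String[] args) throws Exception {
            // Point the client at the Oozie server (hypothetical host/port).
            OozieClient oc = new OozieClient("http://oozie-host:11000/oozie");

            // The workflow app (workflow.xml plus the job jar in its lib/
            // directory) must already be staged in HDFS.
            Properties props = oc.createConfiguration();
            props.setProperty(OozieClient.APP_PATH,
                "hdfs://namenode:8020/user/me/wf-app");
            props.setProperty("user.name", "me");

            // Submit and start the workflow; returns the Oozie job id.
            String jobId = oc.run(props);
            System.out.println("Started workflow " + jobId);
        }
    }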

However, if you want to run from Eclipse, you just need the config
file resources on the classpath of your job project's run
configuration. That should work well enough.
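
In other words, a driver along these lines is enough once
core-site.xml, hdfs-site.xml and mapred-site.xml are visible on the
run configuration's classpath. MyMapperClass/MyReducerClass are the
placeholders from your message, not real classes:

    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.fs.Path;
    import org.apache.hadoop.io.IntWritable;
    import org.apache.hadoop.io.Text;
    import org.apache.hadoop.mapreduce.Job;
    import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
    import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;

    public class RemoteJobDriver {
        public static void main(String[] args) throws Exception {
            // Picks up the *-site.xml files from the classpath, so the
            // job is submitted to the remote cluster they describe.
            Configuration conf = new Configuration();

            Job job = new Job(conf, "remote-submit-example");
            // Finds the jar that contains MyMapperClass so the cluster
            // nodes can load the job classes remotely.
            job.setJarByClass(MyMapperClass.class);
            job.setMapperClass(MyMapperClass.class);
            job.setReducerClass(MyReducerClass.class);
            job.setOutputKeyClass(Text.class);
            job.setOutputValueClass(IntWritable.class);
            FileInputFormat.addInputPath(job, new Path(args[0]));
            FileOutputFormat.setOutputPath(job, new Path(args[1]));
            System.exit(job.waitForCompletion(true) ? 0 : 1);
        }
    }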

On Thu, Sep 20, 2012 at 7:44 PM, Alberto Cordioli
<cordioli.alberto@gmail.com> wrote:
> Hi all,
>
> Now I'd like to deploy a simple MapReduce job, written in Java, to a
> remote cluster from within Eclipse.
> For the moment I've found this solution:
>
> 1) Put the Hadoop conf files on the classpath.
> 2) Put the jar containing the job on the classpath.
> 3) Run.
>
> If I don't put the jar on the classpath, running the job throws a
> ClassNotFoundException:
> java.lang.RuntimeException: java.lang.ClassNotFoundException: <MyMapperClass>
>
> I was wondering if there are other methods to do this in a simpler way.
>
>
> Thank you very much,
> Alberto
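
Regarding the ClassNotFoundException: when you run from Eclipse the
classes usually sit in a bin/ directory rather than in a jar, so
setJarByClass() finds nothing to ship to the cluster. One workaround,
sketched here with a hypothetical local path, is to build the jar and
point the submission at it explicitly instead of putting it on the
classpath:

    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.mapreduce.Job;

    public class ExplicitJarSubmit {
        public static void main(String[] args) throws Exception {
            Configuration conf = new Configuration();
            // Ship this local jar with the job submission; the property
            // was renamed mapreduce.job.jar in later releases.
            conf.set("mapred.jar", "/home/me/myjob.jar");

            Job job = new Job(conf, "explicit-jar-example");
            // ...set mapper/reducer/IO classes as usual, then:
            // job.waitForCompletion(true);
        }
    }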



-- 
Harsh J
