hadoop-user mailing list archives

From Alberto Cordioli <cordioli.albe...@gmail.com>
Subject Re: MapReduce jobs from remote
Date Fri, 21 Sep 2012 13:40:41 GMT
But the setJarByClass(Class) call does not build the jar itself, right?
The compiled jar still has to be on the classpath either way.

Actually, I've tried adding this call, but I still get a
ClassNotFoundException if I don't put the jar on the run classpath.
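
For reference, my submitter looks more or less like this (a minimal
sketch; MyDriver, MyMapper and the job name are placeholders, not my
real classes):

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.LongWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.Mapper;
import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;

public class MyDriver {

    // Placeholder mapper; inherits the identity map() from Mapper.
    public static class MyMapper
            extends Mapper<LongWritable, Text, LongWritable, Text> {
    }

    public static void main(String[] args) throws Exception {
        Configuration conf = new Configuration(); // reads *-site.xml from the classpath
        Job job = new Job(conf, "my-job");        // Hadoop 1.x constructor
        job.setJarByClass(MyDriver.class);        // looks up the jar containing MyDriver
        job.setMapperClass(MyMapper.class);
        job.setOutputKeyClass(LongWritable.class);
        job.setOutputValueClass(Text.class);
        FileInputFormat.addInputPath(job, new Path(args[0]));
        FileOutputFormat.setOutputPath(job, new Path(args[1]));
        System.exit(job.waitForCompletion(true) ? 0 : 1);
    }
}

Run from Eclipse, this still fails on the mapper class unless the
built jar is also on the run classpath.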


Alberto


On 21 September 2012 12:39, Harsh J <harsh@cloudera.com> wrote:
> Hi Alberto,
>
> When doing a "hadoop jar <jar>" you'll of course need a compiled jar, yes.
>
> When running from Eclipse the job class files are already on your
> classpath. If you use the job.setJarByClass(Driver.class); call in your
> submitter, it ensures the framework can locate the jar that contains
> your job classes and ship it as the job jar when you call
> submit()/runJob(). Is this something you're missing?
>
> On Fri, Sep 21, 2012 at 1:00 PM, Alberto Cordioli
> <cordioli.alberto@gmail.com> wrote:
>> Bertrand, I read about the plugin, but it seems to work only with
>> Hadoop 0.20.2, and I am currently working with Hadoop 1.0.3.
>>
>> Harsh, it's not really clear to me why you say that having the config
>> files on the classpath is enough.
>> The jar is also required, right? Otherwise I get a
>> ClassNotFoundException. Am I missing something?
>>
>> Thank you very much,
>> Alberto
>>
>>
>> On 20 September 2012 17:18, Harsh J <harsh@cloudera.com> wrote:
>>> You have already found the simplest way of doing this, I think. The
>>> other option may be to use Oozie, if your job jars don't change much
>>> and can be staged directly into HDFS, ready to submit when required.
>>>
>>> However, if you want to run from Eclipse, you just need the config
>>> file resources on the classpath of your job project's run
>>> configuration. That should work well enough.
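>>>
>>> As a quick sanity check, something like this (a sketch; the class
>>> name is just illustrative, and it assumes the cluster's *-site.xml
>>> files sit on that classpath):
>>>
>>> import org.apache.hadoop.conf.Configuration;
>>>
>>> public class RemoteConfCheck {
>>>     public static void main(String[] args) {
>>>         // core-site.xml on the classpath is read automatically
>>>         Configuration conf = new Configuration();
>>>         // the other site files can be added by resource name
>>>         conf.addResource("hdfs-site.xml");
>>>         conf.addResource("mapred-site.xml");
>>>         // These should print the remote cluster's addresses,
>>>         // not the local defaults:
>>>         System.out.println("fs.default.name = "
>>>                 + conf.get("fs.default.name"));
>>>         System.out.println("mapred.job.tracker = "
>>>                 + conf.get("mapred.job.tracker"));
>>>     }
>>> }
>>>
>>> If those print the remote addresses, a Job built from that conf will
>>> submit to the cluster.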
>>>
>>> On Thu, Sep 20, 2012 at 7:44 PM, Alberto Cordioli
>>> <cordioli.alberto@gmail.com> wrote:
>>>> Hi all,
>>>>
>>>> I'd like to submit a simple MapReduce job, written in Java, to a
>>>> remote cluster from within Eclipse.
>>>> For the moment I've found this solution:
>>>>
>>>> 1) Put the Hadoop config files in the classpath.
>>>> 2) Put the jar containing the job in the classpath.
>>>> 3) Run.
>>>>
>>>> If I don't put the jar in the classpath, running the job fails with
>>>> a ClassNotFoundException:
>>>> java.lang.RuntimeException: java.lang.ClassNotFoundException: <MyMapperClass>
>>>>
>>>> I was wondering if there is a simpler way to do this.
>>>>
>>>>
>>>> Thank you very much,
>>>> Alberto
>>>
>>>
>>>
>>> --
>>> Harsh J
>>
>>
>>
>> --
>> Alberto Cordioli
>
>
>
> --
> Harsh J



-- 
Alberto Cordioli
