hadoop-user mailing list archives

From: Harsh J <ha...@cloudera.com>
Subject: Re: MapReduce jobs from remote
Date: Fri, 21 Sep 2012 14:13:44 GMT
Alberto,

Are you asking about this outside Eclipse? Of course the jar has to
exist for the classes to be found. If you do not want that, then you
need to either place the jar in Hadoop's own lib directories, or stage
it on HDFS and submit via Oozie.
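
As a concrete illustration (a minimal sketch, untested against your
setup; the jar path is hypothetical), you can also point the submission
at a pre-built jar explicitly via the "mapred.jar" property, which is
what setJarByClass() fills in behind the scenes:

    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.fs.Path;
    import org.apache.hadoop.mapreduce.Job;
    import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
    import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;

    public class PrebuiltJarSubmitter {
      public static void main(String[] args) throws Exception {
        Configuration conf = new Configuration();
        // Ship a jar built beforehand, so the job classes need not be
        // loose on the submitter's run classpath. Path is hypothetical.
        conf.set("mapred.jar", "/path/to/myjob.jar");
        Job job = new Job(conf, "prebuilt-jar-example");
        // A real job would set its mapper/reducer classes here; they
        // live inside the shipped jar. This sketch uses the identity
        // defaults so it compiles standalone.
        FileInputFormat.addInputPath(job, new Path(args[0]));
        FileOutputFormat.setOutputPath(job, new Path(args[1]));
        System.exit(job.waitForCompletion(true) ? 0 : 1);
      }
    }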

On Fri, Sep 21, 2012 at 7:10 PM, Alberto Cordioli
<cordioli.alberto@gmail.com> wrote:
> But the setJarByClass(Class) call does not build the jar itself, right?
> The compiled jar still has to be on the classpath in any case.
>
> Actually, I've tried adding that call, but I still get a
> ClassNotFoundException if I don't put the jar on the run classpath.
>
>
> Alberto
>
>
> On 21 September 2012 12:39, Harsh J <harsh@cloudera.com> wrote:
>> Hi Alberto,
>>
>> When doing a "hadoop jar <jar>", you'll of course need a compiled jar, yes.
>>
>> When running from Eclipse the job class files are on your classpath
>> already. If you use the job.setJarByClass(Driver.class); call in your
>> submitter, it will locate the jar that contains that class and ship
>> it as the job jar when you call submit()/runJob(). Is this a step
>> you're missing?
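>>
>> A minimal driver sketch of what I mean (untested; it uses the
>> identity Mapper/Reducer so it compiles standalone, where a real job
>> would set its own classes):
>>
>>     import org.apache.hadoop.conf.Configuration;
>>     import org.apache.hadoop.fs.Path;
>>     import org.apache.hadoop.io.LongWritable;
>>     import org.apache.hadoop.io.Text;
>>     import org.apache.hadoop.mapreduce.Job;
>>     import org.apache.hadoop.mapreduce.Mapper;
>>     import org.apache.hadoop.mapreduce.Reducer;
>>     import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
>>     import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;
>>
>>     public class Driver {
>>       public static void main(String[] args) throws Exception {
>>         // Loads core-site.xml/mapred-site.xml from the classpath.
>>         Configuration conf = new Configuration();
>>         Job job = new Job(conf, "driver-example");
>>         // Finds the jar containing Driver and ships it with the job.
>>         job.setJarByClass(Driver.class);
>>         job.setMapperClass(Mapper.class);   // identity mapper
>>         job.setReducerClass(Reducer.class); // identity reducer
>>         // Matches TextInputFormat's (LongWritable, Text) records,
>>         // which the identity map/reduce pass through unchanged.
>>         job.setOutputKeyClass(LongWritable.class);
>>         job.setOutputValueClass(Text.class);
>>         FileInputFormat.addInputPath(job, new Path(args[0]));
>>         FileOutputFormat.setOutputPath(job, new Path(args[1]));
>>         System.exit(job.waitForCompletion(true) ? 0 : 1);
>>       }
>>     }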
>>
>> On Fri, Sep 21, 2012 at 1:00 PM, Alberto Cordioli
>> <cordioli.alberto@gmail.com> wrote:
>>> Bertrand, I read about the plugin, but it seems to work only with
>>> Hadoop 0.20.2, and I am currently working with Hadoop 1.0.3.
>>>
>>> Harsh, it's not really clear to me why you say it's enough to have
>>> the config files on the classpath.
>>> The jar is also required, right? Otherwise I get a
>>> ClassNotFoundException. Am I missing something?
>>>
>>> Thank you very much,
>>> Alberto
>>>
>>>
>>> On 20 September 2012 17:18, Harsh J <harsh@cloudera.com> wrote:
>>>> You have already found the simplest way of doing this, I think. The
>>>> other way may be to use Oozie, if your job jars don't change much and
>>>> can be staged directly into HDFS, ready to submit when required.
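>>>>
>>>> For example, with the Oozie Java client (a rough sketch; the server
>>>> URL and HDFS paths are made up, and the workflow app must already
>>>> be deployed on HDFS):
>>>>
>>>>     import java.util.Properties;
>>>>     import org.apache.oozie.client.OozieClient;
>>>>
>>>>     public class OozieSubmit {
>>>>       public static void main(String[] args) throws Exception {
>>>>         OozieClient oozie =
>>>>             new OozieClient("http://oozie-host:11000/oozie");
>>>>         Properties props = oozie.createConfiguration();
>>>>         // Points at a workflow.xml already staged on HDFS.
>>>>         props.setProperty(OozieClient.APP_PATH,
>>>>             "hdfs://namenode:9000/user/alberto/apps/myjob");
>>>>         props.setProperty("nameNode", "hdfs://namenode:9000");
>>>>         props.setProperty("jobTracker", "jobtracker:9001");
>>>>         String jobId = oozie.run(props);
>>>>         System.out.println("Submitted Oozie workflow " + jobId);
>>>>       }
>>>>     }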
>>>>
>>>> However, if you want to run from Eclipse, you just need the config
>>>> file resources on the classpath of the run configuration of your job
>>>> project. That should work well enough.
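>>>>
>>>> If you'd rather not put the XML files on the classpath, the same two
>>>> settings (for Hadoop 1.x) can be made programmatically; the host
>>>> names below are made up:
>>>>
>>>>     import org.apache.hadoop.conf.Configuration;
>>>>
>>>>     public class RemoteConf {
>>>>       public static Configuration remote() {
>>>>         Configuration conf = new Configuration();
>>>>         // Normally read from core-site.xml on the classpath.
>>>>         conf.set("fs.default.name", "hdfs://namenode:9000");
>>>>         // Normally read from mapred-site.xml on the classpath.
>>>>         conf.set("mapred.job.tracker", "jobtracker:9001");
>>>>         return conf;
>>>>       }
>>>>     }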
>>>>
>>>> On Thu, Sep 20, 2012 at 7:44 PM, Alberto Cordioli
>>>> <cordioli.alberto@gmail.com> wrote:
>>>>> Hi all,
>>>>>
>>>>> I'd like to deploy a simple MapReduce job, written in Java, to a
>>>>> remote cluster from within Eclipse.
>>>>> For the moment I've found this solution:
>>>>>
>>>>> 1) Put the Hadoop conf files in the classpath.
>>>>> 2) Put the jar containing the job in the classpath.
>>>>> 3) Run.
>>>>>
>>>>> If I don't put the jar in the classpath when I run the job, it fails
>>>>> with a ClassNotFoundException:
>>>>> java.lang.RuntimeException: java.lang.ClassNotFoundException: <MyMapperClass>
>>>>>
>>>>> I was wondering if there is a simpler way to do this.
>>>>>
>>>>>
>>>>> Thank you very much,
>>>>> Alberto
>>>>
>>>>
>>>>
>>>> --
>>>> Harsh J
>>>
>>>
>>>
>>> --
>>> Alberto Cordioli
>>
>>
>>
>> --
>> Harsh J
>
>
>
> --
> Alberto Cordioli



-- 
Harsh J
