hadoop-mapreduce-user mailing list archives

From Mohammad Tariq <donta...@gmail.com>
Subject Re: Job launch from eclipse
Date Tue, 23 Apr 2013 13:46:47 GMT
Hello Han,

      The reason behind this is that the jobs are running inside
Eclipse itself and are not getting submitted to your cluster. Please see if
this link helps:
http://cloudfront.blogspot.in/2013/03/mapreduce-jobs-running-through-eclipse.html#.UXaQsDWH6IQ
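
In Hadoop 1.x the client falls back to the in-process local runner whenever
mapred.job.tracker is left at its default value of "local", which matches the
behaviour you describe. A minimal sketch of the client-side configuration that
makes Eclipse submit to the cluster instead (host names and ports below are
assumptions, substitute your own):

```xml
<!-- core-site.xml on the client: point at HDFS, not the local FS -->
<configuration>
  <property>
    <name>fs.default.name</name>
    <value>hdfs://namenode-host:9000</value>
  </property>
</configuration>

<!-- mapred-site.xml on the client: default "local" runs the job in-JVM -->
<configuration>
  <property>
    <name>mapred.job.tracker</name>
    <value>jobtracker-host:9001</value>
  </property>
</configuration>
```

With these on the client classpath (or set via conf.set() in the driver), the
job shows up in the JobTracker web UI instead of running silently in Eclipse.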


Warm Regards,
Tariq
https://mtariq.jux.com/
cloudfront.blogspot.com
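
For completeness, the jar-based route suggested in the quoted reply below
usually looks like this (jar, class, and path names are assumptions; note the
command is "hadoop jar", with no dash):

```
# package the classes Eclipse compiled into bin/
jar -cvf myjob.jar -C bin/ .

# submit to the cluster; driver class and HDFS paths are placeholders
hadoop jar myjob.jar com.example.MyDriver /user/hadoop/input /user/hadoop/output
```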


On Tue, Apr 23, 2013 at 6:56 PM, shashwat shriparv <
dwivedishashwat@gmail.com> wrote:

> You need to generate a jar file, pass any parameters at run time, and run
> it on Hadoop like: hadoop jar jarfilename.jar <parameters>
>
> *Thanks & Regards    *
>
> ∞
> Shashwat Shriparv
>
>
>
> On Tue, Apr 23, 2013 at 6:51 PM, Han JU <ju.han.felix@gmail.com> wrote:
>
>> Hi,
>>
>> I'm getting my hands on hadoop. One thing I really want to know is how
>> you launch MR jobs in a development environment.
>>
>> I'm currently using Eclipse 3.7 with the Hadoop plugin from Hadoop 1.0.2.
>> With this plugin I can manage HDFS and submit jobs to the cluster. But the
>> strange thing is that every job launched from Eclipse this way is not
>> recorded by the JobTracker (I can't monitor it from the web UI). Yet the
>> output still appears at the HDFS path I gave as a parameter. This makes me
>> think it runs as a standalone job and then writes its output to HDFS.
>>
>> So how do you code and launch jobs to cluster?
>>
>> Many thanks.
>>
>> --
>> *JU Han*
>>
>> UTC   -  Université de Technologie de Compiègne
>> *     **GI06 - Fouille de Données et Décisionnel*
>>
>> +33 0619608888
>>
>
>
