hadoop-common-user mailing list archives

From Michael Bieniosek <mich...@powerset.com>
Subject Re: Remote Job Submission
Date Fri, 23 May 2008 22:50:49 GMT
You could set up an RPC server on a machine that does have Hadoop installed.
Your clients would then send RPC requests to this machine, and the RPC
server would submit the job to Hadoop on their behalf.
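A minimal sketch of such a relay, assuming the gateway machine has the `hadoop` launcher at a known path (the paths, class names, and argument layout here are hypothetical, not from the thread):

```java
import java.util.ArrayList;
import java.util.Arrays;
import java.util.List;

// Sketch of the relay: the gateway receives a job request
// (jar path, main class, job arguments) and shells out to the
// local "hadoop jar" launcher. All names below are placeholders.
public class JobRelay {

    // Build the command line the gateway would execute for a request.
    public static List<String> buildSubmitCommand(String hadoopBin,
                                                  String jarPath,
                                                  String mainClass,
                                                  String... jobArgs) {
        List<String> cmd = new ArrayList<String>();
        cmd.add(hadoopBin);
        cmd.add("jar");
        cmd.add(jarPath);
        cmd.add(mainClass);
        cmd.addAll(Arrays.asList(jobArgs));
        return cmd;
    }

    // On the gateway, a request handler would run the command, e.g.
    //   Process p = new ProcessBuilder(cmd).start();
    //   int exit = p.waitFor();
    // and report the exit code back to the remote client.
    public static void main(String[] args) {
        List<String> cmd = buildSubmitCommand("/opt/hadoop/bin/hadoop",
                "/tmp/wordcount.jar", "org.example.WordCount", "in", "out");
        System.out.println(cmd);
    }
}
```

The relay only needs to expose whatever request format you like (raw sockets, XML-RPC, HTTP); Hadoop itself never sees the remote client.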


On 5/23/08 2:10 PM, "Natarajan, Senthil" <senthil@pitt.edu> wrote:

> The client machine doesn't have Hadoop installed, and it is not a slave
> machine.
> From the client machine, the data and task nodes are not visible.
> In this scenario, how can data be loaded into HDFS and a MapReduce job be
> submitted from the client?
> Is it possible?
> If not, what is the minimal setup needed so that data and jobs can be
> submitted remotely from the client machine?
> Thanks,
> Senthil
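As to the minimal setup: at a minimum the client needs the Hadoop jars on its classpath plus a configuration pointing at the cluster; in the Hadoop 0.1x series that means setting `fs.default.name` and `mapred.job.tracker` in the client's site config (host names and ports below are placeholders):

```xml
<configuration>
  <property>
    <name>fs.default.name</name>
    <value>hdfs://namenode.example.com:9000</value>
  </property>
  <property>
    <name>mapred.job.tracker</name>
    <value>jobtracker.example.com:9001</value>
  </property>
</configuration>
```

The client also needs network access to the NameNode, JobTracker, and DataNodes, since HDFS clients talk to DataNodes directly when reading and writing blocks.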
> -----Original Message-----
> From: Ted Dunning [mailto:tdunning@veoh.com]
> Sent: Friday, May 23, 2008 4:52 PM
> To: core-user@hadoop.apache.org; 'hadoop-user@lucene.apache.org'
> Subject: Re: Remote Job Submission
> Both are possible.  You may have to have access to the data and task nodes
> for some operations.  If you can see all of the nodes in your cluster, you
> should be able to do everything.
> On 5/23/08 1:46 PM, "Natarajan, Senthil" <senthil@pitt.edu> wrote:
>> Hi,
>> I was wondering whether it is possible to submit a MapReduce job to a remote
>> Hadoop cluster,
>> i.e., submitting the job from a machine that doesn't have Hadoop installed
>> to a different machine where Hadoop is installed.
>> Is it possible to do this?
>> I guess at least data can be uploaded to HDFS remotely through a Java
>> program, right?
>> Thanks,
>> Senthil
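To that last question: yes, with the Hadoop client jars and a configuration pointing at the cluster, a remote Java program can write to HDFS directly. A sketch against the 0.1x-era `FileSystem` API (host names and paths are placeholders; this needs `hadoop-core` on the classpath, so it won't compile without it):

```java
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

// Sketch: upload a local file to HDFS from a machine that has only
// the Hadoop client jars, not a full installation. Host names and
// paths below are placeholders.
public class RemoteUpload {
    public static void main(String[] args) throws Exception {
        Configuration conf = new Configuration();
        // Point the client at the remote NameNode.
        conf.set("fs.default.name", "hdfs://namenode.example.com:9000");

        FileSystem fs = FileSystem.get(conf);
        fs.copyFromLocalFile(new Path("/local/data.txt"),
                             new Path("/user/senthil/data.txt"));
        fs.close();
    }
}
```

Job submission works the same way: build a `JobConf`, set `mapred.job.tracker` to the remote JobTracker address, and call `JobClient.runJob(jobConf)`. The catch Ted mentions still applies: the client must be able to reach the DataNodes (for block writes) and the JobTracker over the network.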
