hadoop-common-dev mailing list archives

From "Segel, Mike" <mse...@navteq.com>
Subject RE: Starting a job on a hadoop cluster remotely
Date Wed, 28 Jul 2010 14:10:47 GMT
Hi,

Since you didn't get an answer... yes you can.

I'm working from memory so I may be a bit fuzzy on the details...

Your external app has to be 'cloud aware'. Essentially, create a config file for your application
that it can read in, which tells your app where the JobTracker (JT) and NameNode (NN) are.
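
Roughly something like this; a minimal sketch assuming the 0.20-era configuration keys and
hypothetical host names/ports (in practice your app would read these values from its own
config file rather than hard-coding them):

// A minimal sketch; the host names and ports below are hypothetical and
// would normally come from the application's own config file.
import org.apache.hadoop.conf.Configuration;

public class RemoteClusterConfig {
    public static Configuration create() {
        Configuration conf = new Configuration();
        // Tell the client where HDFS (NameNode) and the JobTracker live.
        conf.set("fs.default.name", "hdfs://namenode.example.com:8020");
        conf.set("mapred.job.tracker", "jobtracker.example.com:8021");
        return conf;
    }
}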

Then you can use the Tool interface (implement it with your job class, typically run via
ToolRunner) and tell it to run...
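
A minimal sketch of that pattern, assuming a hypothetical WordCountJob driver; the job
class, its input/output paths, and the submit() helper are illustrative only, and the
mapper/reducer classes are assumed to already be on the webapp's classpath:

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.conf.Configured;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;
import org.apache.hadoop.util.Tool;
import org.apache.hadoop.util.ToolRunner;

public class WordCountJob extends Configured implements Tool {

    @Override
    public int run(String[] args) throws Exception {
        Job job = new Job(getConf(), "wordcount");
        job.setJarByClass(WordCountJob.class);
        // Mapper, reducer, and output key/value classes would be set here as usual.
        FileInputFormat.addInputPath(job, new Path(args[0]));
        FileOutputFormat.setOutputPath(job, new Path(args[1]));
        return job.waitForCompletion(true) ? 0 : 1;
    }

    // Called from the webapp instead of a command-line main(); the Configuration
    // passed in is the one pointing at the remote JT and NN.
    public static int submit(Configuration conf, String in, String out) throws Exception {
        return ToolRunner.run(conf, new WordCountJob(), new String[] { in, out });
    }
}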

Sorry for being terse... I'm under the gun on getting something out the door.

HTH

-Mike


-----Original Message-----
From: Sebastian Ruff (open4business GmbH) [mailto:ruff@open4business.de] 
Sent: Wednesday, July 28, 2010 7:27 AM
To: common-dev@hadoop.apache.org
Subject: Starting a job on a hadoop cluster remotely

Hey,

 

Is it possible to start a job on a Hadoop cluster remotely? For example, we have a web
application which runs on an Apache Tomcat server, and we would like to start a MapReduce
job on our cluster from within the webapp.

Is this possible? And if so, what are the steps to get there? Do I just have to put my
NameNode and DataNode in a core-site.xml in the webapp and call the API?

 

Thanks a lot,

 

Sebastian



