hadoop-common-user mailing list archives

From Peter Skomoroch <peter.skomor...@gmail.com>
Subject Re: Amazon Elastic MapReduce
Date Thu, 02 Apr 2009 19:01:26 GMT

The API accepts any arguments you can pass in the standard jobconf for
Hadoop 0.18.3, so it is pretty easy to convert an existing jobflow to a JSON
job description that will run on the service.
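As a rough illustration (not taken from the thread), a job flow description in the JSON shape accepted by the Elastic MapReduce RunJobFlow API looked something like the following. The bucket names, jar path, and main class here are placeholders, and the exact field set may differ by API version; jobconf settings pass through as -D arguments in the step's Args list:

```json
{
  "Name": "Example word count flow",
  "Instances": {
    "MasterInstanceType": "m1.small",
    "SlaveInstanceType": "m1.small",
    "InstanceCount": 4
  },
  "Steps": [
    {
      "Name": "Word count step",
      "HadoopJarStep": {
        "Jar": "s3://my-bucket/wordcount.jar",
        "MainClass": "org.example.WordCount",
        "Args": [
          "-D", "mapred.reduce.tasks=2",
          "s3://my-bucket/input/",
          "s3://my-bucket/output/"
        ]
      }
    }
  ]
}
```

An existing Hadoop 0.18 job would map onto this by pointing Jar and MainClass at the packaged job and moving its jobconf overrides into Args.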


On Thu, Apr 2, 2009 at 2:44 PM, Kevin Peterson <kpeterson@biz360.com> wrote:

> So if I understand correctly, this is an automated system to bring up a
> hadoop cluster on EC2, import some data from S3, run a job flow, write the
> data back to S3, and bring down the cluster?
> This seems like a pretty good deal. At the pricing they are offering,
> unless
> I'm able to keep a cluster at more than about 80% capacity 24/7, it'll be
> cheaper to use this new service.
> Does this use an existing Hadoop job control API, or do I need to write my
> flows to conform to Amazon's API?

Peter N. Skomoroch
