airavata-dev mailing list archives

From Danushka Menikkumbura <danushka.menikkumb...@gmail.com>
Subject Re: XBaya/Hadoop Integration - Concern
Date Fri, 21 Jun 2013 23:44:35 GMT
A Hadoop deployment model (single node, local cluster, EMR, etc.) is not
exactly a host, as in Airavata, but it is along the lines of one, IMO.
Therefore we can still stick to a similar model, but we need a different UI
to configure them. Hadoop jobs would still be treated differently and be
configured in the workflow itself (i.e. the current implementation), as
opposed to being predefined as in GFac applications.
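
To make this a bit more concrete, below is a rough sketch of what a
host-like deployment descriptor for Hadoop could look like. The class,
enum, and field names here are purely hypothetical and not part of any
existing Airavata API; per-job settings would still live in the workflow:

    // Hypothetical sketch only: none of these types exist in Airavata today.
    // The idea is that a Hadoop "host" captures the deployment model rather
    // than a concrete machine, while per-job settings stay in the workflow.
    public class HadoopDeploymentDescription {

        public enum DeploymentMode { SINGLE_NODE, LOCAL_CLUSTER, EMR }

        private final DeploymentMode mode;
        private final String nameNodeUri;    // e.g. an HDFS URI; assumed unused for EMR
        private final String jobTrackerUri;  // e.g. a JobTracker address; assumed unused for EMR

        public HadoopDeploymentDescription(DeploymentMode mode,
                                           String nameNodeUri,
                                           String jobTrackerUri) {
            this.mode = mode;
            this.nameNodeUri = nameNodeUri;
            this.jobTrackerUri = jobTrackerUri;
        }

        public DeploymentMode getMode() { return mode; }
        public String getNameNodeUri() { return nameNodeUri; }
        public String getJobTrackerUri() { return jobTrackerUri; }
    }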

Please let me know if you think otherwise.

Cheers,
Danushka


On Wed, Jun 19, 2013 at 12:57 AM, Danushka Menikkumbura
<danushka.menikkumbura@gmail.com> wrote:

> Hi All,
>
> The current UI implementation does not take application/host descriptions
> into account, simply because I believe they have little or no meaning in
> the Hadoop world. The current implementation enables configuring each
> individual job using the UI (please see the attached xbaya-hadoop.png).
>
> The upside of this approach is that new jobs can be added and configured
> dynamically, without adding application descriptions, generating code,
> compiling, re-deploying, etc. The downside is that it differs from general
> GFac application invocation, where each application has an associated
> application/host description. Then again, we are trying to incorporate
> something that does not quite fit into the application/host domain anyway.
>
> Thoughts appreciated.
>
> Thanks,
> Danushka
>
