airavata-dev mailing list archives

From Shameera Rathnayaka <shameerai...@gmail.com>
Subject Re: Airavata 0.16 Design Refactoring Suggestion?
Date Tue, 02 Jun 2015 22:03:00 GMT
Hi All,

Here is how the sequence diagram looks for an experiment submit request,
with the proposed changes.

[Sequence diagram image attached]

Thanks,
Shameera.

On Sat, May 30, 2015 at 4:55 PM, Shameera Rathnayaka <shameera@apache.org>
wrote:

> Hi Devs,
>
> As we are about to release Airavata 0.15 (the branch has already been
> cut), we will not add any major changes while it is in the testing
> stage. This gives us time to discuss and finalize the requirements for
> the next release, which can be either 0.16 or 1.0.
>
>
> According to feedback from our user community, users need a more
> transparent view of what Airavata does when they submit an experiment
> to run a job on a remote compute resource. Airavata users are science
> gateway developers; they are not interested only in Experiment-level
> and remote Job-level status changes. They would also like some degree
> of transparency into the pre-processing and post-processing tasks
> performed by the Airavata framework before and after job submission.
> For example, they would like to see which task is being executed at a
> particular time, and whether an SCP file transfer succeeded or not.
> With the current Handler architecture, it is not possible for the
> Airavata framework to know which handler does what, because users can
> write different kinds of handlers and integrate them into the
> execution chain. If an Airavata job submission fails while
> transferring an input file to the compute resource, the gateway
> developer should be able to find the reason without any trouble.
> Currently Airavata saves the failure reason with a stack trace, but
> that is too low level for a gateway developer.
>
> Here we are thinking of replacing this static handler architecture
> with a dynamic task mechanism. The framework would have different
> types of tasks; for input staging, say, we would have SCP, GRIDFTP and
> HTTP tasks, and each task clearly knows what it needs to do and how.
> Suppose Airavata gets an experiment with three inputs, where one is a
> simple string and the other two are SCP- and HTTP-type file transfer
> inputs. Airavata then decides to add SCP and HTTP tasks to the dynamic
> task chain. Next it adds a job submission task; if the job needs to be
> submitted using SSH keys, Airavata adds an SSH job submission task.
> Likewise, it adds the required tasks for the outputs. Each task has
> three states: Processing, Completed and Failed. In case of failure,
> the framework knows which kind of work it was doing and which task
> failed, for example whether it was the SCP file transfer task or the
> GRIDFTP file transfer task. Airavata can then show these details to
> users through messaging. Please see the following diagram to get an
> idea of the state transitions at the different levels.
>
> Your feedback is highly appreciated.
>
> [State transition diagram image attached]
>
>
> Thanks,
> Shameera.
>
>
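
To make the proposed task mechanism a bit more concrete, here is a rough
sketch of how such tasks and their states might look. This is only an
illustration under my own assumptions; the class and method names below
are hypothetical and are not part of the current Airavata code base.

import java.util.Arrays;
import java.util.List;

public class TaskChainSketch {

    // The three task states described above.
    enum TaskState { PROCESSING, COMPLETED, FAILED }

    // A single unit of work, e.g. SCP input staging or SSH job submission.
    interface Task {
        String getName();
        // Throws on failure so the chain can record a FAILED state with a reason.
        void execute(String experimentId) throws Exception;
    }

    // Hypothetical input staging task; the actual transfer logic is omitted.
    static class ScpInputStagingTask implements Task {
        public String getName() { return "SCP_INPUT_STAGING"; }
        public void execute(String experimentId) throws Exception {
            // copy the input file to the compute resource via SCP
        }
    }

    // Hypothetical SSH-key based job submission task; submission logic omitted.
    static class SshJobSubmissionTask implements Task {
        public String getName() { return "SSH_JOB_SUBMISSION"; }
        public void execute(String experimentId) throws Exception {
            // submit the job over SSH
        }
    }

    // Runs the dynamically assembled task chain and publishes per-task states,
    // so a gateway developer can see which task is running and which one failed.
    static void runChain(String experimentId, List<Task> chain) {
        for (Task task : chain) {
            publish(experimentId, task.getName(), TaskState.PROCESSING, null);
            try {
                task.execute(experimentId);
                publish(experimentId, task.getName(), TaskState.COMPLETED, null);
            } catch (Exception e) {
                // Report which task failed and why, instead of only a stack trace.
                publish(experimentId, task.getName(), TaskState.FAILED, e.getMessage());
                return; // stop the chain on failure
            }
        }
    }

    // In Airavata this would go out over the messaging layer; here we just print.
    static void publish(String experimentId, String taskName, TaskState state, String reason) {
        System.out.println(experimentId + " : " + taskName + " -> " + state
                + (reason == null ? "" : " (" + reason + ")"));
    }

    public static void main(String[] args) {
        runChain("exp-123",
                Arrays.asList(new ScpInputStagingTask(), new SshJobSubmissionTask()));
    }
}

With something along these lines, the messaging layer can publish per-task
state changes, which is the level of transparency the gateway developers
are asking for.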


-- 
Best Regards,
Shameera Rathnayaka.

email: shameera AT apache.org , shameerainfo AT gmail.com
Blog : http://shameerarathnayaka.blogspot.com/
