airavata-dev mailing list archives

From Saminda Wijeratne <samin...@gmail.com>
Subject Persisting GFac job data
Date Tue, 21 May 2013 15:04:24 GMT
It has become increasingly apparent that saving the data related to
executing jobs from GFac can be useful for many reasons, such as:

debugging
retrying
making smart decisions on reliability/cost etc.
statistical analysis

Thus we thought of saving the data related to GFac jobs in the registry in
order to facilitate features such as the above in the future.

However, a GFac job is potentially any sort of computing resource access
(GRAM/UNICORE/EC2 etc.). Therefore we need to come up with a generalized
data structure that can hold the data for any type of resource. Following
are the suggested fields to save for a single GFac job execution:

*experiment id, workflow instance id, node id* - pinpoint the node
execution
*service, host, application description ids* - pinpoint the descriptors
responsible
*local job id* - the unique job id retrieved/generated per execution
[PRIMARY KEY]
*job data* - data related to executing the job (eg: the RSL in GRAM)
*submitted, completed time*
*completed status* - whether the job was successful or ran into errors
etc.
*metadata* - custom field to add anything the user wants
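To make the proposal concrete, the fields above could be sketched as a
plain Java value type along the following lines. This is only an
illustration, not the final schema: the type name GFacJobData, the enum
values, and the use of epoch-millisecond longs for the timestamps are all
assumptions on my part.

```java
// Hypothetical sketch of a generalized GFac job record. All names here
// (GFacJobData, JobStatus, field names) are illustrative assumptions,
// not a committed API.
public record GFacJobData(
        String experimentId,             // pinpoint the node execution
        String workflowInstanceId,
        String nodeId,
        String serviceDescriptionId,     // pinpoint the descriptors responsible
        String hostDescriptionId,
        String applicationDescriptionId,
        String localJobId,               // unique per execution [PRIMARY KEY]
        String jobData,                  // resource-specific payload, e.g. the RSL for a GRAM job
        long submittedTime,              // epoch millis (assumed representation)
        long completedTime,
        JobStatus completedStatus,       // whether the job succeeded or ran into errors
        String metadata) {               // free-form field for anything the user wants

    // Coarse status values; the real set would depend on the resource types supported.
    public enum JobStatus { SUBMITTED, FINISHED, FAILED, CANCELLED, UNKNOWN }
}
```

Since the structure must hold GRAM, UNICORE, and EC2 jobs alike, the
resource-specific details are kept opaque in the *job data* string rather
than modeled as typed columns.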

Your feedback is most welcome. The API-related changes will also be
discussed once we have a proper data structure. We are hoping to implement
this within the next few days.

Thanks,
Saminda
