spark-issues mailing list archives

From "Michal Malohlava (JIRA)" <j...@apache.org>
Subject [jira] [Updated] (SPARK-3270) Spark API for Application Extensions
Date Wed, 17 Sep 2014 23:00:37 GMT

     [ https://issues.apache.org/jira/browse/SPARK-3270?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
]

Michal Malohlava updated SPARK-3270:
------------------------------------
    Description: 
Any application should be able to enrich the Spark infrastructure with services that are not
available by default.

Hence, to support such application extensions (aka "extensions"/"plugins"), the Spark platform
should provide:
  - an API to register an extension
  - an API to register a "service" (i.e., provided functionality)
  - well-defined points in the Spark infrastructure which can be enriched/hooked by an extension
  - a way of deploying an extension (for example, simply putting the extension on the classpath
and using the Java service interface)
  - a way to access an extension from an application
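The registration, access, and classpath-deployment points above could be sketched roughly as follows. This is a minimal illustration, not an existing Spark API: the names SparkExtension, ExtensionRegistry, and their methods are all hypothetical. Classpath deployment here reuses the standard java.util.ServiceLoader mechanism (the "Java service interface") mentioned in the list.

```java
import java.util.HashMap;
import java.util.Map;
import java.util.Optional;
import java.util.ServiceLoader;

// Hypothetical extension contract -- names are illustrative only.
interface SparkExtension {
    String name();            // unique service name
    default void start() {}   // hook invoked on registration
    default void stop() {}    // hook invoked on shutdown
}

// Hypothetical registry combining explicit registration with
// classpath-based discovery.
class ExtensionRegistry {
    private final Map<String, SparkExtension> extensions = new HashMap<>();

    // Explicit registration from application code.
    public void register(SparkExtension ext) {
        extensions.put(ext.name(), ext);
        ext.start();
    }

    // Access a registered service from the application.
    public Optional<SparkExtension> get(String name) {
        return Optional.ofNullable(extensions.get(name));
    }

    // Classpath deployment: discover implementations declared under
    // META-INF/services/ via the Java service interface.
    public void loadFromClasspath() {
        ServiceLoader.load(SparkExtension.class).forEach(this::register);
    }
}
```

An application would then call `registry.get("metrics")` (or similar) to reach a deployed service without Spark itself hardcoding any knowledge of it.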

Overall proposal is available here: https://docs.google.com/document/d/1dHF9zi7GzFbYnbV2PwaOQ2eLPoTeiN9IogUe4PAOtrQ/edit?usp=sharing

Note: In this context, I do not mean reinventing OSGi (or another plugin platform), but it
can serve as a good starting point.


  was:
At the beginning, let's clarify my motivation: I would like to extend the Spark platform with an
embedded application (e.g., one monitoring network performance in the context of selected applications)
which would be launched on particular nodes in the cluster together with them.
Nevertheless, I do not want to modify the Spark code directly and hardcode my code in; instead, I
would prefer to provide a jar which would be registered and launched by Spark itself.

Hence, to support such 3rd-party applications (aka "extensions"/"plugins"), the Spark platform should
provide at least:
  - an API to register an extension
  - an API to register a "service" (i.e., provided functionality)
  - well-defined points in the Spark infrastructure which can be enriched/hooked by an extension
     - in master/worker lifecycle
     - in applications lifecycle
     - in RDDs lifecycle
     - monitoring/reporting
     - ...
  - a way of deploying an extension (for example, simply putting the extension on the classpath and
using the Java service interface)
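The lifecycle hook points listed above could be expressed as a callback interface that an extension implements, overriding only the events it cares about. All names here are illustrative assumptions, not part of any actual Spark release.

```java
import java.util.ArrayList;
import java.util.List;

// Hypothetical hook-point interface -- method names are illustrative only.
// Default no-op bodies mean an extension overrides just what it needs.
interface SparkLifecycleHooks {
    default void onWorkerStart(String workerId) {}              // master/worker lifecycle
    default void onApplicationStart(String appId) {}            // application lifecycle
    default void onRddCreated(int rddId) {}                     // RDD lifecycle
    default void onMetricsReport(String metric, long value) {}  // monitoring/reporting
}

// Example extension that records the events it receives,
// e.g. for auditing or performance monitoring.
class AuditingHooks implements SparkLifecycleHooks {
    final List<String> events = new ArrayList<>();

    @Override public void onApplicationStart(String appId) {
        events.add("app:" + appId);
    }
    @Override public void onRddCreated(int rddId) {
        events.add("rdd:" + rddId);
    }
}
```

Spark would invoke these callbacks at the named points in its infrastructure; the extension never needs to patch Spark's own code.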

In this context, I do not mean reinventing OSGi (or another plugin platform), but it can serve
as a good starting point.




> Spark API for Application Extensions
> ------------------------------------
>
>                 Key: SPARK-3270
>                 URL: https://issues.apache.org/jira/browse/SPARK-3270
>             Project: Spark
>          Issue Type: Improvement
>          Components: Spark Core
>            Reporter: Michal Malohlava
>
> Any application should be able to enrich the Spark infrastructure with services that are not
available by default.
> Hence, to support such application extensions (aka "extensions"/"plugins"), the Spark platform
should provide:
>   - an API to register an extension
>   - an API to register a "service" (i.e., provided functionality)
>   - well-defined points in the Spark infrastructure which can be enriched/hooked by an extension
>   - a way of deploying an extension (for example, simply putting the extension on the classpath
and using the Java service interface)
>   - a way to access an extension from an application
> Overall proposal is available here: https://docs.google.com/document/d/1dHF9zi7GzFbYnbV2PwaOQ2eLPoTeiN9IogUe4PAOtrQ/edit?usp=sharing
> Note: In this context, I do not mean reinventing OSGi (or another plugin platform), but
it can serve as a good starting point.



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org

