ignite-dev mailing list archives

From "Yury Babak (JIRA)" <j...@apache.org>
Subject [jira] [Created] (IGNITE-10286) [ML] Umbrella: Model serving
Date Thu, 15 Nov 2018 22:27:00 GMT
Yury Babak created IGNITE-10286:

             Summary: [ML] Umbrella: Model serving
                 Key: IGNITE-10286
                 URL: https://issues.apache.org/jira/browse/IGNITE-10286
             Project: Ignite
          Issue Type: New Feature
          Components: ml
            Reporter: Yury Babak
            Assignee: Yury Babak

We want to have a convenient API for model serving. This means we need a mechanism for storing
models and running inference on them inside Apache Ignite.

For now, I see two important features: distributed storage for arbitrary models, and inference.

From my point of view, we could use a built-in (predefined) cache as model storage,
and use the service grid for model inference. We could implement a "ModelService" that accesses
this storage, retrieves the list of all suitable models (including model metrics and other
information about each model), lets the caller choose one (or several), and runs inference from this service.

Models imported from TensorFlow should also use the same mechanisms for storage and inference.
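To make the proposal concrete, here is a minimal Java sketch of the API shape being described. All names ("ModelService", "ModelDescriptor", the metric field) are assumptions for illustration, not existing Ignite API: a plain map stands in for the predefined Ignite cache, and direct method calls stand in for a service-grid deployment.

```java
import java.util.Map;
import java.util.Set;
import java.util.concurrent.ConcurrentHashMap;
import java.util.function.DoubleUnaryOperator;

public class ModelServingSketch {

    /** Minimal model abstraction: an inference function plus metrics kept with it. */
    static final class ModelDescriptor {
        final DoubleUnaryOperator model; // the inference function itself
        final double accuracy;           // example metric stored alongside the model

        ModelDescriptor(DoubleUnaryOperator model, double accuracy) {
            this.model = model;
            this.accuracy = accuracy;
        }
    }

    /** Stand-in for the proposed ModelService backed by the model storage. */
    static final class ModelService {
        // Stand-in for the built-in (predefined) cache used as model storage.
        private final Map<String, ModelDescriptor> storage = new ConcurrentHashMap<>();

        void put(String name, ModelDescriptor desc) {
            storage.put(name, desc);
        }

        /** List all stored models so a client can choose by name and metrics. */
        Set<String> listModels() {
            return storage.keySet();
        }

        /** Expose the stored metric for a given model. */
        double metric(String name) {
            return storage.get(name).accuracy;
        }

        /** Run inference with the chosen model on a single input. */
        double infer(String name, double x) {
            return storage.get(name).model.applyAsDouble(x);
        }
    }

    public static void main(String[] args) {
        ModelService service = new ModelService();
        // Store a trained model (here a trivial linear model y = 2x + 1).
        service.put("linreg-v1", new ModelDescriptor(x -> 2 * x + 1, 0.93));

        System.out.println(service.listModels());            // [linreg-v1]
        System.out.println(service.infer("linreg-v1", 2.0)); // 5.0
    }
}
```

In the real proposal, the cache would be a distributed Ignite cache and the service would be deployed on the service grid, so any node could look up models and run inference locally.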

This message was sent by Atlassian JIRA
