ignite-issues mailing list archives

From "Alexey Zinoviev (Jira)" <j...@apache.org>
Subject [jira] [Assigned] (IGNITE-10286) [ML] Umbrella: Model serving
Date Tue, 19 Nov 2019 13:46:00 GMT

     [ https://issues.apache.org/jira/browse/IGNITE-10286?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel

Alexey Zinoviev reassigned IGNITE-10286:

    Assignee: Alexey Zinoviev  (was: Yury Babak)

> [ML] Umbrella: Model serving
> ----------------------------
>                 Key: IGNITE-10286
>                 URL: https://issues.apache.org/jira/browse/IGNITE-10286
>             Project: Ignite
>          Issue Type: New Feature
>          Components: ml
>            Reporter: Yury Babak
>            Assignee: Alexey Zinoviev
>            Priority: Major
>             Fix For: 2.8
> We want to have a convenient API for model serving. This means we need a mechanism for
storing models and running inference on them inside Apache Ignite.
> For now, I see two important features: distributed storage for arbitrary models, and inference.
> From my point of view, we could use a built-in (predefined) cache as model storage,
and use the service grid for model inference. We could implement a "ModelService" that provides
access to the storage, returns the list of all suitable models (including model metrics and other
information about each model), lets the caller choose one (or several), and runs inference through the service.
> Models imported from TensorFlow should use the same storage and inference mechanisms.
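The proposed design above (a built-in cache as model storage, plus a service that lists suitable models and runs inference) can be sketched in plain Java. This is a minimal stdlib-only sketch of the pattern, not actual Ignite API: the names `ModelStorage`, `ModelDescriptor`, `ModelService`, `suitable`, and `infer` are hypothetical illustrations; in a real implementation `ModelStorage` would be backed by an `IgniteCache` and `ModelService` deployed on the Ignite service grid.

```java
import java.util.List;
import java.util.Map;
import java.util.concurrent.ConcurrentHashMap;
import java.util.function.Function;
import java.util.stream.Collectors;

/** Hypothetical descriptor: a stored model plus a metric/metadata entry. */
class ModelDescriptor {
    final String name;
    final double accuracy;                  // example metric kept alongside the model
    final Function<double[], Double> model; // the inference function itself

    ModelDescriptor(String name, double accuracy, Function<double[], Double> model) {
        this.name = name;
        this.accuracy = accuracy;
        this.model = model;
    }
}

/** Stands in for the proposed built-in cache; in Ignite this would be an IgniteCache. */
class ModelStorage {
    private final Map<String, ModelDescriptor> cache = new ConcurrentHashMap<>();

    void put(ModelDescriptor d)      { cache.put(d.name, d); }
    List<ModelDescriptor> listAll()  { return List.copyOf(cache.values()); }
    ModelDescriptor get(String name) { return cache.get(name); }
}

/** Stands in for the proposed "ModelService" on the service grid. */
class ModelService {
    private final ModelStorage storage;

    ModelService(ModelStorage storage) { this.storage = storage; }

    /** List models whose stored metric passes a threshold. */
    List<String> suitable(double minAccuracy) {
        return storage.listAll().stream()
                .filter(d -> d.accuracy >= minAccuracy)
                .map(d -> d.name)
                .collect(Collectors.toList());
    }

    /** Look a model up by name and run inference on the given features. */
    double infer(String name, double[] features) {
        return storage.get(name).model.apply(features);
    }
}

public class ModelServingSketch {
    public static void main(String[] args) {
        ModelStorage storage = new ModelStorage();
        // Two toy "models": one sums the features, one averages them.
        storage.put(new ModelDescriptor("sum", 0.90, f -> {
            double s = 0; for (double x : f) s += x; return s;
        }));
        storage.put(new ModelDescriptor("mean", 0.75, f -> {
            double s = 0; for (double x : f) s += x; return s / f.length;
        }));

        ModelService svc = new ModelService(storage);
        System.out.println(svc.suitable(0.8));                   // prints [sum]
        System.out.println(svc.infer("sum", new double[]{1, 2, 3})); // prints 6.0
    }
}
```

A client would first call `suitable(...)` to pick a model by its stored metrics, then call `infer(...)` on the chosen name; a TensorFlow-imported model would simply be another `ModelDescriptor` wrapping its own inference function.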

This message was sent by Atlassian Jira
