mxnet-dev mailing list archives

From Qing Lan <lanking...@live.com>
Subject New Java Inference API
Date Tue, 04 Sep 2018 18:11:26 GMT
Hi All,

Here is an update for the Java Inference API design doc on CWIKI: https://cwiki.apache.org/confluence/display/MXNET/MXNet+Java+Inference+API.
Currently, the MXNet Java bindings are an extension of the MXNet Scala API that allows users to use
Java to do inference with MXNet. Users will be able to import a pre-trained MXNet model and run
single or batch inference on it.
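A minimal sketch of what such single/batch inference might look like from Java. Note that the `Predictor` interface, the `loadModel` helper, and the model path below are illustrative assumptions for discussion, not the actual API from the design doc (the dummy model just doubles its input so the sketch is runnable):

```java
import java.util.Arrays;
import java.util.List;
import java.util.stream.Collectors;

public class InferenceSketch {
    // Stand-in for a loaded pre-trained model; a real predictor would wrap native MXNet calls.
    interface Predictor {
        float[] predict(float[] input);                  // single inference
        List<float[]> predictBatch(List<float[]> batch); // batch inference
    }

    // Hypothetical loader; the dummy implementation scales inputs by 2 so this runs standalone.
    static Predictor loadModel(String modelPath) {
        return new Predictor() {
            public float[] predict(float[] input) {
                float[] out = new float[input.length];
                for (int i = 0; i < input.length; i++) out[i] = input[i] * 2f;
                return out;
            }
            public List<float[]> predictBatch(List<float[]> batch) {
                return batch.stream().map(this::predict).collect(Collectors.toList());
            }
        };
    }

    public static void main(String[] args) {
        Predictor p = loadModel("model-symbol.json"); // path is illustrative
        System.out.println(Arrays.toString(p.predict(new float[]{1f, 2f})));
    }
}
```

The point of the sketch is the shape of the calling code: load once, then call `predict`/`predictBatch` from plain Java with no Scala types exposed.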

Please take a look the design document again and feel free to leave any thoughts you have.

Thanks,
Qing

On 5/10/18, 11:08 AM, "Andrew Ayres" <andrew.f.ayres@gmail.com> wrote:

    Hi Kellen,
    
    Thanks for the feedback. You bring up an interesting idea about the
    dependencies. I'll add that to the list of things to look into.
    
    As for the threading, my current thinking is that we implement a dispatcher
    thread like suggested in the Scala threading discussion
    https://discuss.mxnet.io/t/fixing-thread-safety-issues-in-scala-library/236.
    I would definitely like to hide such complexities from the user.
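    A rough sketch of that dispatcher-thread idea in plain Java: all inference calls are funneled through a single thread that owns the (not thread-safe) model, while callers block on a `Future`. The class and method names here are illustrative assumptions, not the proposed API:

```java
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.Future;
import java.util.function.Function;

public class DispatchingPredictor implements AutoCloseable {
    // Single thread owns the underlying model; callers submit work to it.
    private final ExecutorService dispatcher = Executors.newSingleThreadExecutor();
    private final Function<float[], float[]> model; // stand-in for the native predictor

    public DispatchingPredictor(Function<float[], float[]> model) {
        this.model = model;
    }

    // Safe to call from any thread: the actual inference runs on the dispatcher thread.
    public float[] infer(float[] input) throws Exception {
        Future<float[]> result = dispatcher.submit(() -> model.apply(input));
        return result.get();
    }

    @Override
    public void close() {
        dispatcher.shutdown();
    }

    public static void main(String[] args) throws Exception {
        try (DispatchingPredictor p = new DispatchingPredictor(in -> new float[]{in[0] + 1f})) {
            System.out.println(p.infer(new float[]{41f})[0]); // prints 42.0
        }
    }
}
```

    From the user's perspective this looks like an ordinary synchronous call, which is exactly the "hide the complexity" property discussed here.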
    
    Andrew
    
    
    On Thu, May 10, 2018 at 3:22 AM, kellen sunderland <
    kellen.sunderland@gmail.com> wrote:
    
    > Hey Andrew, thanks for the write-up.  I think having a Java binding will be
    > very useful for enterprise users.  Doc looks good but two things I'm
    > curious about:
    >
    > How are you planning to handle thread-safe inference?  It'll be great if
    > you can hide the complexity of dealing with dispatch threading from users.
    >
    > The other thing I think a solid Java API could provide is a limited number
    > of dependencies.  There are some simple things we can do to make this happen
    > (create a statically linked, portable .so) but there's also some complexity
    > around minimizing MXNet's dependencies.  For example we'll likely want to
    > release MKL-flavoured binaries, and we should support a few versions of
    > CUDA.  We could try to have one version with an absolute minimum
    > of dependencies (maybe statically linking with OpenBLAS).  It might be good
    > to document exactly which packages you're planning to release, and give some
    > more details about what the dependencies for those packages would be.
    >
    > Many thanks for looking into this, I think it'll be a big improvement for
    > many of our users.
    >
    > -Kellen
    >
    > On Thu, May 10, 2018, 12:57 AM Andrew Ayres <andrew.f.ayres@gmail.com>
    > wrote:
    >
    > > Hi all,
    > >
    > > There has been a lot of interest expressed in having a Java API for doing
    > > inference. The general idea is that after training a model using python,
    > > users would like to be able to load the model for inference inside their
    > > existing production eco-system.
    > >
    > > We've begun exploring a few options for the implementation at
    > > https://cwiki.apache.org/confluence/display/MXNET/MXNet+Java+Inference+API
    > > and would appreciate any insights/feedback.
    > >
    > > Thanks,
    > > Andrew
    > >
    >
    
