mxnet-dev mailing list archives

From Junru Shao <junrushao1...@gmail.com>
Subject Re: [DISCUSS] Rebrand Gluon to MXNet imperative or something MXNet.
Date Sat, 23 Mar 2019 01:45:42 GMT
@Lin Sure! Let's work together to promote MXNet Gluon, GluonNLP and GluonCV.

On Fri, Mar 22, 2019 at 6:44 PM Lin Yuan <apeforest@gmail.com> wrote:

> @Junru I fully agree with what you said. What I meant is we need to make
> more customers aware of them.
>
> Lin
>
> On Fri, Mar 22, 2019 at 6:34 PM Junru Shao <junrushao1994@gmail.com>
> wrote:
>
> > @Lin I believe that the way to build a healthy community is to make both
> > customers and developers happy. In this case, I feel like the more
> > important thing about toolkits is to explain how useful they are to our
> > customers, rather than positions, components or anything else.
> >
> > As I mentioned above, the usefulness comes from two aspects (at least).
> >
> > 1) They provide state-of-the-art models and training techniques
> > out-of-the-box. If our customers want inference only, we have model zoos;
> > if they want to train on their own datasets, we have awesome training
> > tricks enclosed.
> >
> > 2) They provide exemplary codebases for anyone who wants to use Gluon
> > elegantly. That helps a lot in real-world development, compared with the
> > simplest examples, such as tutorials.
> >
> >
> > On Fri, Mar 22, 2019 at 6:07 PM Junru Shao <junrushao1994@gmail.com>
> > wrote:
> >
> > > Probably we should figure out how to explain MXNet Gluon to customers.
> > > In this case, I agree with @Mu that
> > >
> > > 1) MXNet Gluon provides a high-level API, like what Keras gives to
> > > TensorFlow.
> > >
> > > 2) MXNet Gluon supports hybridization, which unifies the symbolic and
> > > imperative programming styles.
> > >
> > > Also, about the toolkits, we could mention that
> > >
> > > 3) GluonNLP and GluonCV are two awesome libraries in their respective
> > > domains, both built on MXNet Gluon. They not only provide an awesome
> > > exemplary codebase for customers to learn the best way to use MXNet
> > > Gluon, but also come with state-of-the-art models and training
> > > techniques out-of-the-box.
> > >
> > > Any other ideas?
> > >
> > >
> > > On Fri, Mar 22, 2019 at 5:54 PM Pedro Larroy <
> > pedro.larroy.lists@gmail.com>
> > > wrote:
> > >
> > >> +1 to MXNet Gluon, given the feedback and explanations from everyone
> > >> so far.
> > >>
> > >> On Fri, Mar 22, 2019 at 5:09 PM Junru Shao <junrushao1994@gmail.com>
> > >> wrote:
> > >> >
> > >> > I feel like MXNet Gluon is a good name. You don't lose customers
> > >> > who are familiar with MXNet, nor customers who are used to MXNet
> > >> > symbolic.
> > >> >
> > >> > On Fri, Mar 22, 2019 at 5:07 PM Davydenko, Denis <
> > >> > dzianis.davydzenka@gmail.com> wrote:
> > >> >
> > >> > > As the subject suggests, this is a proposal to re-brand Gluon to
> > >> > > align it with MXNet. One of the common things undertaken in
> > >> > > re-branding exercises is renaming; that's the thinking behind my
> > >> > > suggesting a new name for Gluon. I am sincerely curious what the
> > >> > > alternatives would be for re-branding Gluon to align it with MXNet
> > >> > > without changing its name.
> > >> > >
> > >> > >
> > >> > > On 3/22/19, 4:57 PM, "Mu Li" <muli.cmu@gmail.com> wrote:
> > >> > >
> > >> > >     Are you proposing to rename Gluon? I think Pedro's opinion is
> > >> > >     about a better way to communicate what Gluon is and how it's
> > >> > >     related to MXNet.
> > >> > >
> > >> > >     On Fri, Mar 22, 2019 at 4:54 PM Davydenko, Denis
> > >> > > <dden@amazon.com.invalid>
> > >> > >     wrote:
> > >> > >
> > >> > >     > I support the idea of putting the MXNet and Gluon brands
> > >> > >     > closer together. I agree with your argument, Mu, but MXNet
> > >> > >     > is quite far from TF's position at this time, so I don't
> > >> > >     > know how well that argument transfers from TF's position to
> > >> > >     > MXNet's.
> > >> > >     >
> > >> > >     > MXNet Imperative is definitely too restrictive a name; we
> > >> > >     > can come up with a better one... MXNet-M, for example,
> > >> > >     > stands for MXNet-Modified (military connotation). If naming
> > >> > >     > is the only thing we need to figure out, that is a good
> > >> > >     > place to be in __
> > >> > >     >
> > >> > >     > --
> > >> > >     > Thanks,
> > >> > >     > Denis
> > >> > >     >
> > >> > >     > On 3/22/19, 4:48 PM, "Mu Li" <muli.cmu@gmail.com>
wrote:
> > >> > >     >
> > >> > >     >     Gluon is about imperative neural network training and
> > >> > >     >     data loading; ndarray is another large imperative
> > >> > >     >     module. Besides, Gluon also supports symbolic execution
> > >> > >     >     after hybridizing. "mxnet imperative" might not be a
> > >> > >     >     good name for it. Another choice is "high-level API";
> > >> > >     >     that's how TF talks about Keras.
> > >> > >     >
> > >> > >     >     On Fri, Mar 22, 2019 at 4:38 PM Yuan Tang <
> > >> > > terrytangyuan@gmail.com>
> > >> > >     > wrote:
> > >> > >     >
> > >> > >     >     > +1
> > >> > >     >     >
> > >> > >     >     > On Fri, Mar 22, 2019 at 7:29 PM Lin Yuan <
> > >> apeforest@gmail.com>
> > >> > >     > wrote:
> > >> > >     >     >
> > >> > >     >     > > +1.
> > >> > >     >     > >
> > >> > >     >     > > Just to give some of my real experience:
> > >> > >     >     > > 1) I advertised a recent GluonNLP blog, and many
> > >> > >     >     > > responses were "This seems nice. So is Gluon a new
> > >> > >     >     > > library to replace MXNet?"
> > >> > >     >     > > 2) We visited customers at a unicorn company who
> > >> > >     >     > > showed interest in MXNet, but none of the engineers
> > >> > >     >     > > knew the relationship between GluonNLP/GluonCV and
> > >> > >     >     > > MXNet.
> > >> > >     >     > > 3) When integrating MXNet with Horovod and adding
> > >> > >     >     > > examples, I received comments like "What is Gluon?
> > >> > >     >     > > Is it a new library in addition to MXNet?"
> > >> > >     >     > >
> > >> > >     >     > > Everyone is talking about PyTorch nowadays, but not
> > >> > >     >     > > Caffe2 anymore, although the latter still serves as
> > >> > >     >     > > a backend component. Maybe we should also double
> > >> > >     >     > > down on one brand?
> > >> > >     >     > >
> > >> > >     >     > > Lin
> > >> > >     >     > >
> > >> > >     >     > > On Fri, Mar 22, 2019 at 4:02 PM Pedro
Larroy <
> > >> > >     >     > pedro.larroy.lists@gmail.com
> > >> > >     >     > > >
> > >> > >     >     > > wrote:
> > >> > >     >     > >
> > >> > >     >     > > > Hi dev@
> > >> > >     >     > > >
> > >> > >     >     > > > We heard feedback from users that the Gluon name
> > >> > >     >     > > > is confusing. Some of them don't even know it's
> > >> > >     >     > > > MXNet, and its relationship with MXNet is unclear.
> > >> > >     >     > > >
> > >> > >     >     > > > Would it make sense to rebrand Gluon as just
> > >> > >     >     > > > MXNet, or MXNet imperative? Diluting brands and
> > >> > >     >     > > > names is never a good idea.
> > >> > >     >     > > >
> > >> > >     >     > > > There's also GluonHQ, which is related to JavaFX
> > >> > >     >     > > > and adds to the confusion; search-engine
> > >> > >     >     > > > friendliness is not high either.
> > >> > >     >     > > >
> > >> > >     >     > > > Pedro.
> > >> > >     >     > > >
> > >> > >     >     > >
> > >> > >     >     >
> > >> > >     >
> > >> > >     >
> > >> > >     >
> > >> > >
> > >> > >
> > >> > >
> > >> > >
> > >>
> > >
> >
>
