mxnet-dev mailing list archives

From Jun Wu <wujun....@gmail.com>
Subject Implementing zero-dim and zero-size tensors in MXNet and its impact on your codebases
Date Thu, 11 Apr 2019 06:24:56 GMT
Dear Community,

A while ago, we sent out an RFC
<https://github.com/apache/incubator-mxnet/issues/14253> discussing the
initiative to introduce NumPy compatibility into MXNet. As the first outcome
of this initiative, we submitted a PR
<https://github.com/apache/incubator-mxnet/pull/14661> providing the
infrastructure for supporting zero-dim (scalar) and zero-size tensors, which
have long been missing in MXNet.

In our implementation, we have made our best effort to keep the promise of
backward compatibility across all the language bindings. Nevertheless, we
would like to explicitly call out the changes that may impact existing
codebases built on top of MXNet, whether they call C-APIs directly or
implement operators in their own repos.

1. In your application, if you call any of the following shape-related
C-APIs, you will need to change the data type of a shape's ndim and dim_size
from *unsigned int* to signed *int*, because we now use -1 to represent
unknown shape information and reserve 0 for scalar and zero-size tensors. One
example of such a change can be seen in the cpp-package
<https://github.com/apache/incubator-mxnet/pull/14661/files#diff-c0e77771fcfe1619faa4ff5f59d94e8bR183>
calling MXSymbolInferShape; a short caller-side sketch also follows the list
below.
- MXSymbolInferShape
- MXSymbolInferShapePartial
- MXExecutorSimpleBind
- MXExecutorReshape
- MXNDArrayGetShape
- MXNDArrayCreateFromSharedMem
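
To make the type change concrete, here is a minimal, self-contained sketch of
the caller-side switch. The buffer name and shape values are illustrative
assumptions, not MXNet's exact C-API signature, which you should take from
mxnet/c_api.h; the point is simply that ndim and dim_size values are now
passed and read back as signed int, with -1 meaning unknown.

    // Hedged sketch of the caller-side change; names and values are
    // illustrative, not the exact MXNet C-API signature.
    #include <cstdio>
    #include <vector>

    int main() {
      // Before: shape data handed to C-APIs such as MXSymbolInferShape was
      // typically collected as unsigned int, and 0 meant "unknown":
      //   std::vector<unsigned int> arg_shape_data = {0, 3, 224, 224};

      // After: collect shape data as signed int; -1 now means "unknown dim
      // size", 0 is a legitimate size of a zero-size tensor, and ndim == 0
      // denotes a scalar (zero-dim) tensor.
      std::vector<int> arg_shape_data = {-1, 3, 224, 224};

      for (int dim : arg_shape_data) {
        if (dim == -1) {
          std::printf("dimension size unknown\n");
        } else {
          std::printf("dimension size = %d\n", dim);
        }
      }
      return 0;
    }

The same signed-int convention applies to the ndim and dim_size values
returned by the APIs above, so any unsigned buffers or casts on the caller
side should be updated as well.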

2. If you have implemented operators in your own codebases, you will
probably need to change every operator's shape inference function to use the
following util functions to check whether shape information is known, instead
of checking against 0 directly. One example of such a change can be seen in
the shape inference function
<https://github.com/apache/incubator-mxnet/pull/14661/files#diff-afa640c4653c59f00f43a84455f91ef9R35>
of the concat operator; a minimal sketch of the new check pattern follows the
list below.
- shape_is_known (include/mxnet/tuple.h)
- ndim_is_known (include/mxnet/tuple.h)
- dim_size_is_known (include/mxnet/tuple.h)
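
Below is a minimal, self-contained sketch of the new check pattern. It uses
stand-in helpers with the same semantics as the utils above and a plain
std::vector<int> in place of mxnet::TShape so it compiles on its own; these
stand-ins are assumptions for illustration, and the real functions in
include/mxnet/tuple.h operate on MXNet's shape types.

    // Hedged, self-contained sketch: Shape is a stand-in for mxnet::TShape,
    // and the helpers mirror the semantics of the utils in tuple.h.
    #include <vector>

    using Shape = std::vector<int>;      // dims; empty means zero-dim (scalar)
    constexpr int kUnknownDimSize = -1;  // -1 now encodes "unknown"

    bool dim_size_is_known(int dim_size) { return dim_size != kUnknownDimSize; }

    bool shape_is_known(const Shape& shape) {
      // In MXNet, ndim_is_known(shape) would be checked first; ndim is
      // always known for this stand-in type.
      for (int dim : shape) {
        if (!dim_size_is_known(dim)) return false;
      }
      return true;
    }

    // Old pattern (no longer valid): `if (shape[i] == 0)` treated 0 as unknown.
    // New pattern: 0 is a real size (zero-size tensor); unknown is -1.
    bool all_inputs_known(const std::vector<Shape>& in_shapes) {
      for (const Shape& s : in_shapes) {
        if (!shape_is_known(s)) return false;
      }
      return true;
    }

    int main() {
      std::vector<Shape> in_shapes = {{2, 0, 4}, {2, kUnknownDimSize, 4}};
      // Returns 1: the second shape has an unknown dim, while {2, 0, 4} is a
      // fully-known zero-size tensor.
      return all_inputs_known(in_shapes) ? 0 : 1;
    }

The key shift is that a dim size of 0 is now a valid, fully-known size, so a
check like shape[i] == 0 no longer means "unknown" and must be replaced with
the util functions listed above.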

If you are interested in the value of scalar tensors, and hence in
understanding our motivation further, this discussion thread
<https://discuss.mxnet.io/t/rank-0-arrays-in-mxnet-aka-pi-is-wrong/108>
provides very good insights from a data-science point of view. It was
actually related to an opportunity for MXNet to become the backend of PyMC
<https://en.wikipedia.org/wiki/PyMC3>, which did not go through because
several key features were missing
<https://discuss.mxnet.io/t/moving-pymc3-from-theano-to-mxnet/86>, scalar
tensors being one of them.

Please leave comments on the PR
<https://github.com/apache/incubator-mxnet/pull/14661> if you have any
concerns or suggestions about our work.

Thank you very much for your time and consideration.

Best,
Jun

*References*
[1] RFC of NumPy compatibility:
https://github.com/apache/incubator-mxnet/issues/14253
[2] Pull request of supporting scalar and zero-size tensors:
https://github.com/apache/incubator-mxnet/pull/14661
[3] The value of scalar tensors from the view of data science:
https://discuss.mxnet.io/t/rank-0-arrays-in-mxnet-aka-pi-is-wrong/108
[4] Previous discussion for MXNet becoming the backend of PyMC:
https://discuss.mxnet.io/t/moving-pymc3-from-theano-to-mxnet/86
