mxnet-dev mailing list archives

From "Kumar, Vikas" <viku...@amazon.com.INVALID>
Subject [Launch Announcement] Dynamic training with Apache MXNet
Date Thu, 29 Nov 2018 18:26:17 GMT
Hello MXNet community,

MXNet users can now use Dynamic Training (DT) for deep learning models with Apache MXNet. DT
helps reduce training cost and training time by adding elasticity to the distributed training
cluster. DT also helps increase instance pool utilization: unused instances can be added to
speed up training, and instances can later be removed from the training cluster to be used by
other applications.
For details, refer to the DT blog<https://aws.amazon.com/blogs/machine-learning/introducing-dynamic-training-for-deep-learning-with-amazon-ec2/>.
Developers should be able to integrate Dynamic Training into their existing distributed training
code with the addition of a few extra lines of code<https://github.com/awslabs/dynamic-training-with-apache-mxnet-on-aws#writing-a-distributed-training-script>.
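
For reference, below is a minimal sketch of the standard MXNet distributed data-parallel
setup that DT extends. The DT-specific launcher and elasticity hooks are documented in the
repo linked above and are not shown here; `train_data` is a placeholder for a per-worker
data iterator, and the tiny model is purely illustrative.

import mxnet as mx
from mxnet import gluon, autograd

# Standard MXNet data-parallel training via a distributed key-value store.
# Dynamic Training builds on this setup; the extra DT integration lines are
# described in the awslabs repo linked above.
kv = mx.kvstore.create('dist_sync')          # synchronous parameter-server training

net = gluon.nn.Dense(10)                     # illustrative model
net.initialize(mx.init.Xavier())
trainer = gluon.Trainer(net.collect_params(), 'sgd',
                        {'learning_rate': 0.01}, kvstore=kv)
loss_fn = gluon.loss.SoftmaxCrossEntropyLoss()

for data, label in train_data:               # placeholder: per-worker data shard
    with autograd.record():
        loss = loss_fn(net(data), label)
    loss.backward()
    trainer.step(data.shape[0])              # gradients aggregated through the kvstore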

Thank you to all the contributors: Vikas Kumar <https://github.com/Vikas89>, Haibin
Lin <https://github.com/eric-haibin-lin>, Andrea Olgiati <https://github.com/andreaolgiati>,
Mu Li <https://github.com/mli>, Hagay Lupesko <https://github.com/lupesko>,
Aaron Markham <https://github.com/aaronmarkham>, Sergey Sokolov <https://github.com/Ishitori>

This is an effort towards making neural network training cheaper and faster. We welcome your
contributions to the repo: https://github.com/awslabs/dynamic-training-with-apache-mxnet-on-aws.
We would love to hear your feedback and ideas in this direction.

Thanks
Vikas