mxnet-dev mailing list archives

From "Kumar, Vikas" <>
Subject Re: [Launch Announcement] Dynamic training with Apache MXNet
Date Thu, 29 Nov 2018 18:55:09 GMT
A big thanks to Qi Qiao < > for making it easy for users
to set up a cluster for dynamic training using AWS CloudFormation.

From: "Kumar, Vikas" <>
Date: Thursday, November 29, 2018 at 10:26 AM
To: "" <>
Subject: [Launch Announcement] Dynamic training with Apache MXNet

Hello MXNet community,

MXNet users can now use Dynamic Training (DT) for deep learning models with Apache MXNet. DT
helps reduce training cost and training time by adding elasticity to the distributed
training cluster. DT also helps increase instance pool utilization: unused instances
can be added to speed up training, and instances can later be removed from the training
cluster to be used by other applications.
For details, refer to DT blog<>.
Developers should be able to integrate Dynamic Training into their existing distributed training
code with the introduction of a few extra lines of code<>.
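To give a feel for the idea, here is a minimal, purely illustrative sketch of an elastic
parameter-server loop in plain Python. The names (ParameterServer, join, leave, push, pull)
are hypothetical and are not the MXNet or DT API; the sketch only shows the concept of
workers joining and leaving a shared training run while parameters keep updating.

```python
# Toy sketch of elastic distributed training (NOT the MXNet/DT API):
# a shared parameter store with a worker pool that can grow and shrink
# mid-run, which is the core idea behind Dynamic Training.

class ParameterServer:
    def __init__(self, weights):
        self.weights = list(weights)   # shared model parameters
        self.workers = set()           # current elastic worker pool

    def join(self, worker_id):
        # A new instance joins the training cluster mid-run.
        self.workers.add(worker_id)

    def leave(self, worker_id):
        # An instance is reclaimed for another application.
        self.workers.discard(worker_id)

    def push(self, grads, lr=0.25):
        # Apply a worker's gradient update to the shared weights (SGD step).
        self.weights = [w - lr * g for w, g in zip(self.weights, grads)]

    def pull(self):
        # Workers fetch the latest weights before computing gradients.
        return list(self.weights)


ps = ParameterServer([1.0, 2.0])
ps.join("worker-0")
ps.push([1.0, 1.0])           # worker-0 trains alone
ps.join("worker-1")           # cluster grows: an idle instance is added
ps.push([1.0, 1.0])
ps.leave("worker-1")          # cluster shrinks: instance returned to pool
print(ps.pull())              # [0.5, 1.5]
print(sorted(ps.workers))     # ['worker-0']
```

In real MXNet distributed training the equivalent role is played by the KVStore
(e.g. `mx.kv.create('dist_sync')`); DT's contribution is letting the set of workers
behind it change while training continues.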

Thank you to all the contributors – Vikas Kumar <>, Haibin Lin <>, Andrea Olgiati <>,
Mu Li <>, Hagay Lupesko <>, Markham Aaron <>, Sergey Sokolov <>, Qi Qiao <>

This is an effort towards making neural network training cheaper and faster. We welcome your
contributions to the repo -
. We would love to hear your feedback and ideas in this direction.
