singa-dev mailing list archives

From Wang Wei <wang...@apache.org>
Subject Re: [ANNOUNCE] Apache SINGA (incubating) 0.2.0 release
Date Sat, 16 Jan 2016 06:24:39 GMT
Hi Greg,

Thanks!
We have added the disclaimer to the download page, and will include it in the
announcement email for the next release.

Best,
Wei

On Fri, Jan 15, 2016 at 8:23 PM, Greg Stein <gstein@gmail.com> wrote:

> In the future, you MUST include the Incubation Disclaimer in your release
> announcements. Unfortunately, your announcement email was moderated
> through, erroneously. Please correct your procedures for your next release.
>
> Your download page should also include the disclaimer.
>
> Thx,
> -g
>
> On Thu, Jan 14, 2016 at 9:42 PM, Wang Wei <wangwei@apache.org> wrote:
>
> > Hi,
> >
> > We are pleased to announce that Apache SINGA (incubating) 0.2.0 is
> > released.
> >
> > SINGA is a general distributed deep learning platform for training big
> > deep learning models over large datasets. It is designed with an
> > intuitive
> > programming model based on the layer abstraction. SINGA supports a wide
> > variety of popular deep learning models.
> >
> > The release is available at:
> > http://singa.apache.org/downloads.html
> >
> > The main features of this release include
> >
> > * Training on GPU  -- enabling training of complex models on a single
> > node
> > with multiple GPU cards.
> > * Hybrid neural net partitioning -- supporting data and model parallelism
> > at the same time.
> > * Python wrapper -- making it easier to configure jobs, including the neural
> > net and the SGD algorithm (a rough configuration sketch follows this message).
> > * RNN model and BPTT algorithm -- supporting applications based on RNN
> > models, e.g., GRU.
> > * Cloud software integration, including Mesos, Docker and HDFS.
> > * Visualization of neural net structure and layer information -- helpful
> > for debugging.
> > * Linear algebra functions and random functions against Blobs and raw
> > data
> > pointers.
> > * New layers, including SoftmaxLayer, ArgSortLayer, DummyLayer, RNN
> > layers
> > and cuDNN layers.
> > * Update Layer class -- for carrying multiple data/grad Blobs.
> > * Extract features and test performance for new data by loading
> > previously
> > trained model parameters.
> > * Add Store class for IO operations.
> >
> > We look forward to hearing your feedback, suggestions, and contributions
> > to the project (http://singa.apache.org/develop/schedule.html).
> >
> > On behalf of the SINGA team,
> > Wei Wang
> >
>
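
Note: the announcement mentions a Python wrapper for configuring jobs (neural
net plus SGD algorithm) but does not show its API. The sketch below is a rough,
plain-Python illustration of that style of layer-based job configuration only;
the names (Layer, SGD, JobConfig) and every parameter are assumptions made for
illustration, not SINGA 0.2.0's actual wrapper API.

# Illustrative only: this does NOT use SINGA's Python wrapper (whose API the
# announcement does not show). It mimics, with plain Python, the kind of
# configuration the wrapper is described as enabling: declare layers, chain
# them into a net, and attach SGD hyper-parameters.

from dataclasses import dataclass, field
from typing import List


@dataclass
class Layer:
    """One layer in the net; `kind` and `params` are generic placeholders."""
    name: str
    kind: str                       # e.g. "Dense", "ReLU", "Softmax", "GRU"
    params: dict = field(default_factory=dict)


@dataclass
class SGD:
    """Plain SGD hyper-parameters (names here are generic, not SINGA's)."""
    learning_rate: float = 0.01
    momentum: float = 0.9
    weight_decay: float = 0.0005


@dataclass
class JobConfig:
    """Everything a hypothetical training job needs: net and update rule."""
    net: List[Layer]
    updater: SGD
    batch_size: int = 64
    epochs: int = 10

    def describe(self) -> str:
        lines = [f"job: {len(self.net)} layers, "
                 f"batch={self.batch_size}, epochs={self.epochs}"]
        for layer in self.net:
            lines.append(f"  {layer.name}: {layer.kind} {layer.params}")
        lines.append(f"  updater: lr={self.updater.learning_rate}, "
                     f"momentum={self.updater.momentum}")
        return "\n".join(lines)


if __name__ == "__main__":
    # A tiny MLP, expressed as an ordered list of layers.
    job = JobConfig(
        net=[
            Layer("fc1", "Dense", {"in": 784, "out": 256}),
            Layer("relu1", "ReLU"),
            Layer("fc2", "Dense", {"in": 256, "out": 10}),
            Layer("loss", "Softmax"),
        ],
        updater=SGD(learning_rate=0.05),
    )
    print(job.describe())

Running the sketch only prints the assembled configuration; with the real
wrapper, an equivalent job description would be handed to SINGA for training.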
