singa-dev mailing list archives

From "ASF subversion and git services (JIRA)" <j...@apache.org>
Subject [jira] [Commented] (SINGA-145) New SGD based optimization Updaters: AdaDelta, Adam, AdamMax
Date Wed, 24 Feb 2016 09:26:18 GMT

    [ https://issues.apache.org/jira/browse/SINGA-145?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15162721#comment-15162721 ]

ASF subversion and git services commented on SINGA-145:
-------------------------------------------------------

Commit e32e70ccdd16582c633f12e3e6702636139a6078 in incubator-singa's branch refs/heads/master
from ijingo
[ https://git-wip-us.apache.org/repos/asf?p=incubator-singa.git;h=e32e70c ]

SINGA-145 New SGD based optimization Updaters: AdaDelta, Adam, AdamMax

New Updaters: AdaDelta, Adam, AdamMax.
To implement AdamMax, add two new element-wise operators for Tensor in cxxnet_op.h, i.e. op::abs and op::max.
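For context, a rough sketch of what such element-wise operators typically look like in the cxxnet/mshadow functor style; the struct and member names below are illustrative assumptions and the actual definitions in cxxnet_op.h may differ:

    // Hedged sketch only: illustrative element-wise functors in the
    // cxxnet/mshadow unary/binary-operator style; not the committed code.
    namespace op {
    struct abs {  // |x|; gives the gradient magnitude needed by AdamMax
      inline static float Map(float a) {
        return a < 0.0f ? -a : a;
      }
    };
    struct max {  // element-wise maximum of two inputs
      inline static float Map(float a, float b) {
        return a > b ? a : b;
      }
    };
    }  // namespace op

With functors like these, the expression-template Tensor machinery can apply abs/max element-wise over whole tensors, which is what the AdamMax update requires.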


> New SGD based optimization Updaters: AdaDelta, Adam, AdamMax
> ------------------------------------------------------------
>
>                 Key: SINGA-145
>                 URL: https://issues.apache.org/jira/browse/SINGA-145
>             Project: Singa
>          Issue Type: New Feature
>         Environment: Universal
>            Reporter: Wang Ji
>            Priority: Minor
>
> This ticket implements three Stochastic Gradient Descent (SGD) based optimization algorithms
> in Updater.cc: AdaDelta, and Adaptive Moment Estimation (Adam) with its variant AdamMax. These
> algorithms adapt the learning rate per parameter, performing larger updates for infrequently
> updated parameters and smaller updates for frequently updated ones.
> For algorithm details, refer to AdaDelta (http://arxiv.org/abs/1212.5701) and Adam (http://arxiv.org/pdf/1412.6980.pdf).
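For readers skimming the thread, here is a hedged, scalar-form sketch of the Adam and AdamMax (called AdaMax in the Kingma & Ba paper) update rules; the function and variable names below are illustrative and are not the ones used in Updater.cc:

    #include <algorithm>
    #include <cmath>

    // Hedged sketch only: scalar form of the Adam update, not the Updater.cc code.
    // w: parameter, g: gradient, m/v: running moment estimates, t: 1-based step count.
    void AdamStep(float& w, float g, float& m, float& v, int t,
                  float lr = 0.001f, float beta1 = 0.9f,
                  float beta2 = 0.999f, float eps = 1e-8f) {
      m = beta1 * m + (1.0f - beta1) * g;             // first moment (mean of gradients)
      v = beta2 * v + (1.0f - beta2) * g * g;         // second moment (uncentered variance)
      float m_hat = m / (1.0f - std::pow(beta1, t));  // bias correction
      float v_hat = v / (1.0f - std::pow(beta2, t));
      w -= lr * m_hat / (std::sqrt(v_hat) + eps);
    }

    // AdamMax variant: the second moment is replaced by an infinity-norm estimate u,
    // which is why element-wise abs and max operators are needed on Tensors.
    void AdamMaxStep(float& w, float g, float& m, float& u, int t,
                     float lr = 0.002f, float beta1 = 0.9f, float beta2 = 0.999f) {
      m = beta1 * m + (1.0f - beta1) * g;
      u = std::max(beta2 * u, std::abs(g));            // the op::max / op::abs step
      w -= (lr / (1.0f - std::pow(beta1, t))) * m / u; // guard for u == 0 omitted for brevity
    }

Because u only tracks the infinity norm of past gradients, AdamMax avoids the square root in Adam's denominator and needs nothing beyond element-wise abs and max, which matches the two operators added in this commit.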



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
