systemml-issues mailing list archives

From "Janardhan (JIRA)" <j...@apache.org>
Subject [jira] [Updated] (SYSTEMML-2018) Fixing Weight Decay Regularization in ADAM
Date Thu, 16 Nov 2017 08:20:00 GMT

     [ https://issues.apache.org/jira/browse/SYSTEMML-2018?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Janardhan updated SYSTEMML-2018:
--------------------------------
    Summary: Fixing Weight Decay Regularization in ADAM  (was: FIXING WEIGHT DECAY REGULARIZATION
IN ADAM)

> Fixing Weight Decay Regularization in ADAM
> ------------------------------------------
>
>                 Key: SYSTEMML-2018
>                 URL: https://issues.apache.org/jira/browse/SYSTEMML-2018
>             Project: SystemML
>          Issue Type: Improvement
>          Components: Algorithms
>            Reporter: Janardhan
>
> The common implementations of adaptive gradient algorithms, such
> as Adam, limit the potential benefit of weight decay regularization, because the
> weights do not decay multiplicatively (as would be expected for standard weight
> decay) but by an additive constant factor.
> The following paper found a way to fix regularization in Adam with one additional
> term (+ w*X, i.e., the decay rate w times the parameters X, decoupled from the
> gradient) in the update step; a minimal sketch of the change follows after the
> quoted description:
> https://arxiv.org/pdf/1711.05101.pdf
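
For orientation only (not part of the quoted report): below is a minimal NumPy sketch of
the proposed change, contrasting the usual L2-style coupling of weight decay with the
decoupled "+ w*X" term from the paper. The function name, arguments, and the `decoupled`
flag are hypothetical illustrations; they are not SystemML's DML implementation of Adam.

    import numpy as np

    def adam_update(X, dX, m, v, t, lr=0.001, beta1=0.9, beta2=0.999,
                    eps=1e-8, decay=0.0, decoupled=False):
        """One Adam step (hypothetical helper, for illustration only).

        decoupled=False: weight decay is folded into the gradient (L2 style),
                         so it gets rescaled by the adaptive denominator.
        decoupled=True:  the fix from https://arxiv.org/abs/1711.05101, where
                         the decay term (decay * X) is added to the update itself.
        """
        if not decoupled:
            dX = dX + decay * X                 # coupled decay (L2 regularization)
        m = beta1 * m + (1 - beta1) * dX        # first-moment estimate
        v = beta2 * v + (1 - beta2) * dX ** 2   # second-moment estimate
        m_hat = m / (1 - beta1 ** t)            # bias corrections
        v_hat = v / (1 - beta2 ** t)
        step = m_hat / (np.sqrt(v_hat) + eps)
        if decoupled:
            step = step + decay * X             # the decoupled "+ w*X" term
        return X - lr * step, m, v

With decoupled=True every parameter shrinks by the same relative amount (lr * decay) per
step regardless of its gradient history; with decoupled=False, parameters with large
gradient magnitudes are effectively regularized less, which is the limitation the issue
describes.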



--
This message was sent by Atlassian JIRA
(v6.4.14#64029)
