systemml-issues mailing list archives

From "Mike Dusenberry (JIRA)" <>
Subject [jira] [Commented] (SYSTEMML-2018) Fixing Weight Decay Regularization in ADAM
Date Mon, 20 Nov 2017 19:04:00 GMT


Mike Dusenberry commented on SYSTEMML-2018:

[~return_01] Yeah, that would be interesting to explore further. I would suggest creating
a new optimizer called {{AdamW}}, as is done in the paper, rather than altering the current
Adam one.
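
For concreteness, here is a minimal NumPy sketch of what a separate {{AdamW}} update step could look like; the function name, signature, and hyperparameter values are illustrative assumptions, not SystemML's actual DML optimizer interface:

{code:python}
import numpy as np

def adamw_update(X, dX, lr, beta1, beta2, epsilon, lambda_wd, t, m, v):
    """One AdamW step: standard Adam moment updates plus decoupled weight decay.

    The decay term is applied directly to the weights, outside the adaptive
    gradient scaling, rather than being folded into the gradient as L2 would be.
    """
    t = t + 1
    m = beta1 * m + (1 - beta1) * dX           # first moment estimate
    v = beta2 * v + (1 - beta2) * dX ** 2      # second raw moment estimate
    m_hat = m / (1 - beta1 ** t)               # bias-corrected first moment
    v_hat = v / (1 - beta2 ** t)               # bias-corrected second moment
    # Decoupled decay: the extra "+ lambda * X" term from the paper, added
    # outside the adaptive denominator so the weights decay multiplicatively.
    X = X - lr * (m_hat / (np.sqrt(v_hat) + epsilon) + lambda_wd * X)
    return X, t, m, v

# Toy usage: a few steps on the loss ||X||^2.
X = np.ones((3, 3))
m, v, t = np.zeros_like(X), np.zeros_like(X), 0
for _ in range(100):
    dX = 2 * X                                 # gradient of the toy loss
    X, t, m, v = adamw_update(X, dX, lr=0.01, beta1=0.9, beta2=0.999,
                              epsilon=1e-8, lambda_wd=0.01, t=t, m=m, v=v)
{code}

The key design point, and the reason a separate optimizer is cleaner than a flag on the existing Adam, is that the decay term sits outside the adaptive scaling, so the weights decay at a rate independent of the gradient history.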

> Fixing Weight Decay Regularization in ADAM
> ------------------------------------------
>                 Key: SYSTEMML-2018
>                 URL:
>             Project: SystemML
>          Issue Type: Improvement
>          Components: Algorithms
>            Reporter: Janardhan
> Common implementations of adaptive gradient algorithms, such as Adam, limit the
> potential benefit of weight decay regularization, because the weights do not decay
> multiplicatively (as would be expected for standard weight decay) but by an additive
> constant factor.
> The following paper proposes a fix for weight decay regularization in Adam with one
> additional step (+ wx) in the gradient step:
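
For context, the update-rule difference the description refers to can be written out as follows; this is one common formulation (standard Adam notation, where \hat{m}_t and \hat{v}_t are the bias-corrected moment estimates, \eta the learning rate, and \lambda the decay factor):

{noformat}
% Adam with L2 regularization folded into the gradient: the decay term
% feeds the moment estimates and is rescaled by the adaptive denominator,
% so the weights decay only by an additive factor.
g_t = \nabla f(\theta_{t-1}) + \lambda \, \theta_{t-1}, \qquad
\theta_t = \theta_{t-1} - \eta \, \frac{\hat{m}_t}{\sqrt{\hat{v}_t} + \epsilon}

% Decoupled weight decay (the paper's "+ wx" step): the decay is applied
% directly to the weights, outside the adaptive scaling.
\theta_t = \theta_{t-1} - \eta \left( \frac{\hat{m}_t}{\sqrt{\hat{v}_t} + \epsilon}
           + \lambda \, \theta_{t-1} \right)
{noformat}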
