singa-dev mailing list archives

From GitBox <...@apache.org>
Subject [GitHub] [singa] chrishkchris commented on issue #674: Autograd Layer constructor
Date Sun, 12 Apr 2020 03:33:54 GMT
URL: https://github.com/apache/singa/issues/674#issuecomment-612558533
 
 
   > we may need more discussion before the implementation, e.g., which option to go?
   
   My personal opinion is: 
   1. Keep the constructors of the existing layer classes and examples unchanged. Otherwise the change may not be backward compatible with the current examples/APIs, and we may need to spend a lot of time debugging the existing examples; since we are close to the release, it is better not to take the risk.
   2. For the new RNN PR, the tensor size is inferred from the input, and the initialization statement should be put inside execute_once(fn, dev); then there is no problem (see the first sketch after this list).
   3. If we really want to support "get_params()" when we use the RNN function, we can make use of the module class to buffer the ops, so there is no actual run. In this case, all the parameter sizes are known after the forward function, and we can use "get_params()" afterward. If we do not want to use the graph after getting the parameter sizes, we can call ResetGraph to clear the buffer, then turn off buffering and run without the graph. (In this setting, the layer class needs module-class buffering, so it needs to be a subclass of the module class; see the second sketch after this list.)
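   A minimal sketch of what I mean in point 2, written in plain Python/NumPy rather than the actual PR code (names such as LazyDense are hypothetical): parameter creation is deferred until the first input arrives, so the shape can be inferred from the input.
   
   ```python
   import numpy as np
   
   class LazyDense:
       def __init__(self, hidden_size):
           self.hidden_size = hidden_size
           self.W = None  # not created yet: the input size is unknown
   
       def __call__(self, x):
           if self.W is None:
               # First call: infer the input size and create the parameter,
               # the same idea as initializing inside execute_once(fn, dev).
               in_size = x.shape[-1]
               self.W = np.random.randn(in_size, self.hidden_size) * 0.01
           return x @ self.W
   
   layer = LazyDense(hidden_size=8)
   y = layer(np.ones((4, 16)))  # W is created here with shape (16, 8)
   ```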
   
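   And a second sketch for point 3, again illustrative only and not SINGA's actual Module API (BufferedLayer, reset_graph, etc. are made-up names): buffer the ops in a dry-run forward so the parameter shapes become known, expose them via get_params(), then clear the buffer and run eagerly afterwards.
   
   ```python
   class BufferedLayer:
       def __init__(self):
           self.buffering = True  # graph/buffer mode is on initially
           self.op_buffer = []    # recorded ops; nothing is executed
           self.params = {}
   
       def forward(self, x_shape, out_size):
           if self.buffering:
               # Dry run: record the op and derive the parameter shape
               # from the input shape; no actual computation happens.
               self.params["W"] = (x_shape[-1], out_size)
               self.op_buffer.append(("matmul", x_shape, "W"))
   
       def get_params(self):
           # Only meaningful after one buffered forward pass.
           return self.params
   
       def reset_graph(self):
           # Clear the buffered ops and switch off buffering,
           # so subsequent runs execute without the graph.
           self.op_buffer.clear()
           self.buffering = False
   
   layer = BufferedLayer()
   layer.forward((4, 16), out_size=8)  # buffered; shapes are now known
   print(layer.get_params())           # {'W': (16, 8)}
   layer.reset_graph()                 # drop the buffer, run eagerly next
   ```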
   @XJDKC @dcslin @joddiy could you please give your suggestions as well?

