singa-dev mailing list archives

From GitBox <>
Subject [GitHub] [singa] chrishkchris commented on issue #674: Autograd Layer constructor
Date Sun, 12 Apr 2020 01:47:20 GMT
chrishkchris commented on issue #674: Autograd Layer constructor
   > > > I think it's not an issue. When we use the computational graph, initialization
operations won't be buffered since we just need to execute them once. For these operations,
I just execute them immediately instead of buffering them into the graph at present. So before
running the entire graph, all parameter tensors will be initialized.
   > > 
   > > 
   > > Yes, if the initialization is in the init function, it will not be buffered automatically.
If the initialization is in the call function, we can still add a few lines to turn off the buffering.
In both cases, the graph function won't be affected.
   > how to turn it off?
   To turn the buffering off, you just need to add three lines:
   1. Before the statement you want to execute immediately, add two lines to save the current flag and disable the graph:
   flag = param.device.graph_enabled()
   param.device.EnableGraph(False)
   2. After the statement, add one line to restore the flag:
   param.device.EnableGraph(flag)
   Note that param is any input tensor that has the attribute "device" for us to use.
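   The save/disable/restore pattern above can be sketched as a small helper. This is a minimal illustration, not SINGA code: MockDevice is a hypothetical stand-in for a SINGA device, and the method names graph_enabled() / EnableGraph() follow the calls quoted in this thread.

```python
class MockDevice:
    """Hypothetical stand-in mimicking a SINGA device's graph flag."""

    def __init__(self):
        self._graph_enabled = True  # buffering on by default

    def graph_enabled(self):
        # Return whether operations are currently buffered into the graph.
        return self._graph_enabled

    def EnableGraph(self, flag):
        # Turn graph buffering on or off.
        self._graph_enabled = flag


def run_unbuffered(device, op):
    """Execute op() immediately, without buffering it into the graph."""
    flag = device.graph_enabled()   # 1a. save the current flag
    device.EnableGraph(False)       # 1b. turn buffering off
    result = op()                   # the statement to execute eagerly
    device.EnableGraph(flag)        # 2.  restore the previous flag
    return result


dev = MockDevice()
out = run_unbuffered(dev, lambda: 21 * 2)
print(out, dev.graph_enabled())  # the flag is restored after the call
```

   Restoring the saved flag (rather than hard-coding True) keeps the helper safe to nest: if buffering was already off, it stays off afterwards.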
