mxnet-commits mailing list archives

From GitBox <...@apache.org>
Subject [GitHub] [incubator-mxnet] apeforest edited a comment on issue #15120: [bug] fix higher grad log
Date Wed, 05 Jun 2019 20:33:13 GMT
apeforest edited a comment on issue #15120: [bug] fix higher grad log 
URL: https://github.com/apache/incubator-mxnet/pull/15120#issuecomment-499170409
 
 
   @kshitij12345 I have a question about the equation `expected_head_grad = (grad_op(x) * head_grad_grads).asnumpy()` in your test.
   
   My understanding from the chain rule is:
   
   ```
   Given y = f(x)
   dL/dx = dL/dy * dy/dx -->  this is the first backward pass. Let dL/dy be y_grad; we get dL/dx (denoted x_grad)
   
   Now we rewrite the above equation:
   
   input0: y_grad
   input1: x
   output: x_grad = y_grad * f'(x)
   
   Another backward pass for this would be:
   dL/d y_grad = dL/d x_grad * f'(x)
   dL/dx = dL/d x_grad * y_grad * f''(x)
   ```
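   A quick finite-difference sanity check of the two derivatives in the second backward pass above, using f = log (as in the test) so that f'(x) = 1/x and f''(x) = -1/x^2. This is plain Python, not MXNet API; the names simply mirror the equations:
   
   ```python
   import math
   
   def f_prime(x):
       # f = log, so f'(x) = 1/x
       return 1.0 / x
   
   def f_second(x):
       # f''(x) = -1/x^2
       return -1.0 / x ** 2
   
   def x_grad(y_grad, x):
       # First backward pass: x_grad = y_grad * f'(x),
       # viewed as a function of the two inputs y_grad and x.
       return y_grad * f_prime(x)
   
   x, y_grad, eps = 2.0, 3.0, 1e-6
   
   # d x_grad / d y_grad should equal f'(x)
   num_d_ygrad = (x_grad(y_grad + eps, x) - x_grad(y_grad - eps, x)) / (2 * eps)
   assert math.isclose(num_d_ygrad, f_prime(x), rel_tol=1e-5)
   
   # d x_grad / d x should equal y_grad * f''(x)
   num_d_x = (x_grad(y_grad, x + eps) - x_grad(y_grad, x - eps)) / (2 * eps)
   assert math.isclose(num_d_x, y_grad * f_second(x), rel_tol=1e-5)
   ```
   
   Both checks pass, which matches reading x_grad as a function of the two inputs y_grad and x.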
   What is the meaning of dL/d y_grad? Are we treating y_grad as another input variable here?

   
   Many thanks for your clarification.
   
   

----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
users@infra.apache.org


With regards,
Apache Git Services
