singa-dev mailing list archives

From GitBox <...@apache.org>
Subject [GitHub] [singa] dcslin commented on issue #707: Layer mismatch causes session to to terminate abruptly
Date Tue, 02 Jun 2020 05:31:48 GMT

dcslin commented on issue #707:
URL: https://github.com/apache/singa/issues/707#issuecomment-637286868


   Hi @Shashankwer, understood that the error message is not clear enough, but I could not
reproduce the error without further details (inputs, outputs). Could you refer to the following
working example, adapted from your code, to help with debugging?
   
   ```
   from singa import autograd
   from singa import opt
   from singa import tensor


   class MLP():
       def __init__(self):
           # Linear(4, 3) consumes the 4 features produced by Linear(3, 4)
           self.linear1 = autograd.Linear(3, 4)
           self.linear2 = autograd.Linear(4, 3)

       def forward(self, x):
           y = self.linear1(x)
           return self.linear2(y)

       def loss(self, out, ty):
           return autograd.softmax_cross_entropy(out, ty)

       def optim(self, loss):
           self.optimizer.backward_and_update(loss)

       def set_optimizer(self, optimizer):
           self.optimizer = optimizer


   if __name__ == '__main__':
       # gaussian() fills the tensor in place, so initialize on a
       # separate line rather than chaining it off the constructor
       x = tensor.Tensor((3, 3))
       x.gaussian(1, 1)
       y = tensor.Tensor((3, 3))
       y.gaussian(1, 1)

       autograd.training = True
       m = MLP()
       sgd = opt.SGD()
       m.set_optimizer(sgd)
       out = m.forward(x)
       loss = m.loss(out, y)
       m.optim(loss)
       print(loss)
   ```
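
   As an aside, a "layer mismatch" of this kind usually means two consecutive layers
disagree on feature sizes: `Linear(3, 4)` outputs 4 features, so the next layer must
accept 4 inputs. A minimal, SINGA-independent sketch of that check (`check_layers` is a
hypothetical helper written for illustration, not part of the SINGA API):

   ```
   def check_layers(dims):
       """Validate that consecutive layers agree on feature sizes.

       dims: list of (in_features, out_features), one pair per layer.
       """
       for i in range(len(dims) - 1):
           out_f = dims[i][1]
           in_f = dims[i + 1][0]
           if out_f != in_f:
               raise ValueError(
                   "layer %d outputs %d features but layer %d expects %d"
                   % (i, out_f, i + 1, in_f))

   # The MLP above: Linear(3, 4) -> Linear(4, 3), shapes agree.
   check_layers([(3, 4), (4, 3)])
   ```

   Running the same check on a mismatched stack such as `[(3, 4), (5, 3)]` raises a
`ValueError`, which is the situation the unclear error message in this issue corresponds to.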
   
   


----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
users@infra.apache.org


