singa-dev mailing list archives

From GitBox <...@apache.org>
Subject [GitHub] [singa] Shashankwer opened a new issue #707: Layer mismatch causes session to terminate abruptly
Date Wed, 20 May 2020 09:52:37 GMT

Shashankwer opened a new issue #707:
URL: https://github.com/apache/singa/issues/707


   Hi,

   This issue might already be known; however, creating a neural network layer stack with mismatched layer dimensions can cause the current Python session to end abruptly, without generating any Python stack trace, while computing the model loss (e.g. autograd.mse_loss(y, t)).
   
   For example, consider a simple feed-forward neural network:

    from singa import autograd

    class MLP:
        def __init__(self):
            self.linear1 = autograd.Linear(3, 4)   # input dim 3 -> hidden dim 4
            self.linear2 = autograd.Linear(4, 3)   # hidden dim 4 -> output dim 3

        def forward(self, x):
            y = self.linear1(x)
            return self.linear2(y)   # final output has dimension 3
   
   If the target passed to the loss does not have the same dimension of 3 as the model output, the current session terminates without generating any error.
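
   A minimal reproduction might look like the sketch below (untested; it assumes the MLP class above and uses numpy with tensor.from_numpy to build illustrative inputs):

    import numpy as np
    from singa import tensor, autograd

    autograd.training = True
    model = MLP()

    # a batch of 2 samples with the expected input dimension of 3
    x = tensor.from_numpy(np.random.randn(2, 3).astype(np.float32))
    y = model.forward(x)   # output shape (2, 3)

    t_ok = tensor.from_numpy(np.zeros((2, 3), dtype=np.float32))
    loss = autograd.mse_loss(y, t_ok)    # matching shapes: fine

    t_bad = tensor.from_numpy(np.zeros((2, 1), dtype=np.float32))
    loss = autograd.mse_loss(y, t_bad)   # mismatched shapes: the whole session aborts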
   
   The only output is the check-failure warning below:
   
   WARNING: Logging before InitGoogleLogging() is written to STDERR
   F0520 17:37:19.265754 288538048 tensor.cc:431] Check failed: shape_.at(m - i) == 1 (3 vs. 1) i= 0
   *** Check failure stack trace: ***
   
   This forces the entire program/notebook to be rerun. The same issue is not seen in autograd.backward, which raises an assertion error instead.
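
   For comparison, a sketch of the recoverable case (assuming the autograd.backward(loss) generator API used in the SINGA examples):

    try:
        # a mismatch that only surfaces during the backward pass raises
        # a Python AssertionError, which can be caught and handled:
        for p, g in autograd.backward(loss):
            pass
    except AssertionError as e:
        print("recoverable:", e)   # the session survives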
   
   Thanks and Regards,
   Shashank 


----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
users@infra.apache.org


