mxnet-commits mailing list archives

From GitBox <...@apache.org>
Subject [GitHub] [incubator-mxnet] IvyGongoogle edited a comment on issue #14159: [Feature Request] Support fp16 for c++ api
Date Thu, 25 Apr 2019 03:34:56 GMT
IvyGongoogle edited a comment on issue #14159: [Feature Request] Support fp16 for c++ api
URL: https://github.com/apache/incubator-mxnet/issues/14159#issuecomment-485865319
 
 
   > @IvyGongoogle Are your inputs and weights in fp16? Your change should work to run
fp16 inference. What batch size are you using, and what is the model? For smaller batch sizes
you may not see a big speedup. Also, what hardware are you running it on?
   
   
   @anirudh2290 @KellenSunderland Sorry, my test results show that with fp16, inference
is twice as fast as with fp32 for a resnetv1-50 CNN model, but it does not work for an OCR
recognition model. I have updated the comment above. If you have experience with
OCR inference using MXNet fp16, please give me some advice.
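
   As a side note on the casting being discussed: a minimal, framework-agnostic sketch
(using NumPy rather than the MXNet C++ API, and with hypothetical parameter names) of
converting fp32 weights to fp16 before inference. Halving the parameter byte size is one
reason fp16 inference can be faster on hardware with fp16 support:

```python
import numpy as np

# Hypothetical fp32 parameter dictionary standing in for a model's weights.
params_fp32 = {
    "conv0_weight": np.random.randn(64, 3, 7, 7).astype(np.float32),
    "fc_weight": np.random.randn(1000, 2048).astype(np.float32),
}

# Cast every parameter to float16 before running inference.
params_fp16 = {name: w.astype(np.float16) for name, w in params_fp32.items()}

for name in params_fp32:
    # Each cast parameter is float16 and uses half the bytes of its fp32 original.
    assert params_fp16[name].dtype == np.float16
    assert params_fp16[name].nbytes * 2 == params_fp32[name].nbytes
```

   Note that inputs must be cast to float16 as well, or the framework will typically
promote the computation back to fp32.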

----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
users@infra.apache.org


With regards,
Apache Git Services
