tvm-dev mailing list archives

From Animesh Jain <notificati...@github.com>
Subject Re: [dmlc/tvm] [RFC][Quantization] Support quantized models from TensorflowLite (#2351)
Date Wed, 29 May 2019 15:53:21 GMT
> Yes, I believe the MobilenetV2 relu_6 is effectively fused in by the downscale saturation.
> You might need it if you want to support their way of training, though.
>
> Yes, Mobilenet has the q_add, but I suggest the Inceptionv3 for q_concatenate, since it
> also has concat nodes feeding into concat nodes, and tflite also has to rescale inputs
> inside the concat operations.
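For concreteness, here is a minimal NumPy sketch of the rescaling described above, with illustrative helper names that are not TVM or TFLite APIs: each input to a quantized concat is requantized to the output's scale and zero point, and the clamp in that downscale is the same saturation that can absorb a relu_6.

import numpy as np

# Requantize a uint8 tensor from (in_scale, in_zp) to (out_scale, out_zp).
# The final clamp to [qmin, qmax] is the downscale saturation; if 6.0 maps
# to qmax under the output parameters, it also acts as the relu_6 clamp.
def requantize(q, in_scale, in_zp, out_scale, out_zp, qmin=0, qmax=255):
    real = in_scale * (q.astype(np.int32) - in_zp)       # dequantize
    q_new = np.round(real / out_scale) + out_zp          # rescale
    return np.clip(q_new, qmin, qmax).astype(np.uint8)   # saturate

# Quantized concat: inputs may carry different (scale, zero_point) pairs,
# so each one is first rescaled to the output quantization parameters.
def quantized_concat(tensors, scales, zps, out_scale, out_zp, axis=-1):
    rescaled = [requantize(t, s, z, out_scale, out_zp)
                for t, s, z in zip(tensors, scales, zps)]
    return np.concatenate(rescaled, axis=axis)

This is a float reference for clarity; the actual TFLite kernels implement the same rescaling with fixed-point arithmetic.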

Makes sense. For now, I was thinking of not worrying about depthwise conv, so I decided to
go with Inception V3. Given that we are just getting started, I don't have a strong inclination
towards any particular network. My aim is to focus on getting the right infrastructure in place
first and to showcase it with one large network. The performance micro-optimizations can then
be phased in.

-- 
You are receiving this because you are subscribed to this thread.
Reply to this email directly or view it on GitHub:
https://github.com/dmlc/tvm/issues/2351#issuecomment-496996627