tvm-dev mailing list archives

From masahi <notificati...@github.com>
Subject Re: [dmlc/tvm] [RFC][DISCUSS] Tuple-related Fusion (#3039)
Date Thu, 18 Apr 2019 07:09:23 GMT
@zhiics It looks like a tuple with duplicated tensors is only problematic if it is the return
value of a subfunction (i.e. a function that is lowered to TOPI and compiled by TVM). If we
lift the tuple out of the subfunction and put it directly under the global function, it seems
to work fine. The test below works on my local machine.

```
import tvm
from tvm import relay

# Build a function whose return value is a tuple containing the same tensor twice.
data = relay.var("data", relay.ty.TensorType((1, 32, 32, 3), "float32"))
log = relay.log(data)
func = relay.Function([data], relay.Tuple(tvm.convert([log, log])))
func = relay.ir_pass.infer_type(func)

# Compile with fusion enabled; this succeeds because the tuple ends up
# outside the fused subfunction.
with relay.build_config(opt_level=3):
    graph, lib, params = relay.build(func, target="llvm")
```
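
To double-check the result numerically, here is a minimal sketch of how one could run the compiled module and confirm that both tuple outputs equal log(data). It assumes the pre-0.6 `tvm.contrib.graph_runtime` interface that matches the `relay.build` call above; treat it as a sketch rather than part of the original test.

```
# Hedged sketch: execute the compiled module and verify both tuple outputs.
# Assumes the pre-0.6 tvm.contrib.graph_runtime API matching relay.build above.
import numpy as np
from tvm.contrib import graph_runtime

x = np.random.uniform(size=(1, 32, 32, 3)).astype("float32")
m = graph_runtime.create(graph, lib, tvm.cpu(0))
m.set_input("data", x)
m.set_input(**params)
m.run()
out0 = m.get_output(0).asnumpy()
out1 = m.get_output(1).asnumpy()
np.testing.assert_allclose(out0, np.log(x), rtol=1e-5)
np.testing.assert_allclose(out1, out0)
```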

The tuple is now lifted out of the fused subfunction %0:
```
fn (%data: Tensor[(1, 32, 32, 3), float32]) -> (Tensor[(1, 32, 32, 3), float32], Tensor[(1, 32, 32, 3), float32]) {
  %0 = fn (%p0: Tensor[(1, 32, 32, 3), float32], __dict__=meta[StrMap][0]) -> Tensor[(1, 32, 32, 3), float32] {
    log(%p0)
  }
  %1 = %0(%data)
  (%1, %1)
}
```
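
For reference, a short sketch of how one might inspect the fused IR directly to see where the tuple lands, assuming the same pre-0.6 `relay.ir_pass` API used above also exposes `fuse_ops` with this signature (an assumption on my part):

```
# Hedged sketch: run operator fusion by itself and print the resulting IR,
# so the placement of the tuple relative to the fused subfunction is visible.
# Assumes relay.ir_pass.fuse_ops(expr, opt_level) from the pre-0.6 API.
fused = relay.ir_pass.fuse_ops(func, opt_level=3)
fused = relay.ir_pass.infer_type(fused)
print(fused)
```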

-- 
You are receiving this because you are subscribed to this thread.
Reply to this email directly or view it on GitHub:
https://github.com/dmlc/tvm/issues/3039#issuecomment-484382985