From: GitBox
To: dev@singa.apache.org
Reply-To: dev@singa.apache.org
Subject: [GitHub] [singa] nudles commented on a change in pull request #586: Add GlobalAVGPool operator for autograd and onnx
Message-ID: <158039187444.23996.4952258594767695062.gitbox@gitbox.apache.org>
Date: Thu, 30 Jan 2020 13:44:34 -0000

nudles commented on a change in pull request #586: Add GlobalAVGPool operator for autograd and onnx
URL: https://github.com/apache/singa/pull/586#discussion_r372953025

##########
File path: python/singa/autograd.py
##########

@@ -2760,3 +2760,42 @@ def backward(self, dy):
 def reciprocal(x):
     return Reciprocal()(x)[0]
+
+
+class GlobalAveragePool(Operation):
+    def __init__(self, data_format='channels_first'):
+        super(GlobalAveragePool, self).__init__()
+        self.data_format = data_format
+
+    def forward(self, x):
+        if training:
+            self.mask = singa.Tensor(list(x.shape()), x.device())
+
+        shape = list(x.shape())
+
+        # (N x C x H x W) for channels_first
+        if self.data_format == 'channels_first':
+            axes = tuple(i for i in range(2, len(shape)))
+            self.shape_divisor = 1/np.prod(shape[2:])
+        else:  # (N x H x W x C) for channels_last
+            axes = tuple(i for i in range(1, len(shape)-1))
+            self.shape_divisor = 1/np.prod(shape[1:-1])
+
+        # output shape
+        # (N x C x 1 x 1) for channels_first
+        # (N x 1 x 1 x C) for channels_last
+        for i in axes:
+            shape[i] = 1
+
+        x = tensor.from_raw_tensor(x)
+        x = tensor.sum(x, axis=axes)

Review comment:
   the raw tensor should also have the sum function?

----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
users@infra.apache.org

With regards,
Apache Git Services
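
For reference, the quoted diff ends at the sum reduction; the rest of forward() presumably scales that sum by shape_divisor to produce the average. A minimal NumPy sketch of the intended global-average-pool semantics, assuming that completion (global_average_pool_ref and the use of plain numpy arrays are illustrative only, not code from the PR):

import numpy as np

def global_average_pool_ref(x, data_format='channels_first'):
    # Pick the spatial axes the same way the forward() excerpt does.
    if data_format == 'channels_first':   # (N, C, H, W, ...)
        axes = tuple(range(2, x.ndim))
    else:                                  # (N, H, W, ..., C)
        axes = tuple(range(1, x.ndim - 1))
    # Sum over the spatial axes, keep them as size-1 dims, and scale by
    # 1 / prod(spatial dims) -- the same factor as shape_divisor above.
    divisor = np.prod([x.shape[i] for i in axes])
    return x.sum(axis=axes, keepdims=True) / divisor

# A (2, 3, 4, 4) channels_first input yields a (2, 3, 1, 1) output whose
# entries are the per-channel means over the 4x4 spatial grid.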