Date: Sat, 11 Jul 2015 06:30:05 +0000 (UTC)
From: "wangwei (JIRA)"
To: dev@singa.incubator.apache.org
Subject: [jira] [Updated] (SINGA-29) Update NeuralNet class to enable customizing layer partition type

[ https://issues.apache.org/jira/browse/SINGA-29?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

wangwei updated SINGA-29:
-------------------------
Description:

This ticket updates the NeuralNet class to enable users to customize the partitioning of each layer. It also cleans up the code of the NeuralNet and Graph classes.

There are two places where the user can configure the partitioning of the neural net:
* partition_dim for the whole neural net (in NetProto)
* partition_dim for each layer (in LayerProto)

The net's partition_dim is copied to a layer if that layer's partition_dim is not set; otherwise, the layer's own partition_dim is used.

Currently we support three values of partition_dim:
* partition_dim = -1: no partition.
* partition_dim = 0: partition along the batch dimension, e.g., split one mini-batch of 100 images into two partitions of 50 images each.
* partition_dim = 1: partition along the feature dimension, e.g., split one mini-batch of 100 images, each represented by a 128-d feature vector, into two partitions; each partition then holds all 100 images, each represented by a 64-d feature vector.

Simple partitioning schemes will be tested first, e.g., partitioning the whole net with partition_dim = 0. Complex schemes, e.g., hybrid partitioning (some layers partitioned along dimension 0, others along dimension 1 or -1), will be tested later.
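The two-level partition_dim resolution and the three partitioning modes above can be sketched as follows. This is a minimal illustration, not SINGA's actual API; the function names and the list-of-vectors batch representation are assumptions for the example.

```python
# Minimal sketch (hypothetical, not SINGA's API) of the two-level
# partition_dim configuration and the three supported modes.

def resolve_partition_dim(net_partition_dim, layer_partition_dim=None):
    """The layer's own partition_dim wins; otherwise the net's value is copied."""
    return net_partition_dim if layer_partition_dim is None else layer_partition_dim

def partition(batch, partition_dim, num_partitions):
    """Split a mini-batch (a list of feature vectors) into num_partitions parts.

    partition_dim = -1: no partition
    partition_dim =  0: split along the batch dimension (fewer images per part)
    partition_dim =  1: split along the feature dimension (shorter vectors)
    """
    if partition_dim == -1:
        return [batch]
    if partition_dim == 0:
        step = len(batch) // num_partitions
        return [batch[i * step:(i + 1) * step] for i in range(num_partitions)]
    if partition_dim == 1:
        dim = len(batch[0]) // num_partitions
        return [[vec[i * dim:(i + 1) * dim] for vec in batch]
                for i in range(num_partitions)]
    raise ValueError("unsupported partition_dim")

# 100 images, each a 128-d feature vector, matching the examples above.
batch = [[0.0] * 128 for _ in range(100)]
by_batch = partition(batch, 0, 2)    # two parts of 50 images x 128-d
by_feature = partition(batch, 1, 2)  # two parts of 100 images x 64-d
```

Splitting the same batch along dimension 0 yields two partitions of 50 full vectors, while dimension 1 yields two partitions that each keep all 100 images but only half of every feature vector.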
Advanced partitioning, such as assigning some layers to one worker and other layers to another, will be supported later by customizing the partition_id of LayerProto.

NeuralNet is constructed as follows: the neural net configuration is converted to a graph with one node per (sub-)layer. Connection nodes are inserted automatically if the neural net needs partitioning (e.g., group size > 1). After a topological sort, one Layer is created per node and the layers are connected accordingly.

The Graph class provides functions for adding/removing nodes and edges, and for sorting nodes in topological order. Each node stores the configuration of one layer.

> Update NeuralNet class to enable customizing layer partition type
> -----------------------------------------------------------------
>
>                 Key: SINGA-29
>                 URL: https://issues.apache.org/jira/browse/SINGA-29
>             Project: Singa
>          Issue Type: Bug
>            Reporter: wangwei
>            Assignee: wangwei
>

--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
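The construction flow described in the ticket (configuration -> graph -> topological sort -> one Layer per node) can be illustrated with a toy sketch. The Graph here is not SINGA's actual class; the node names and edge list are invented for the example, and the sort is Kahn's algorithm.

```python
from collections import deque

# Toy sketch (not SINGA's Graph class) of the construction flow:
# each node holds one layer's configuration; edges follow the net
# connectivity; layers would be created in topological order.

def topological_sort(nodes, edges):
    """Kahn's algorithm: return nodes so that every edge (u, v) has u before v."""
    indegree = {n: 0 for n in nodes}
    adjacent = {n: [] for n in nodes}
    for u, v in edges:
        adjacent[u].append(v)
        indegree[v] += 1
    queue = deque(n for n in nodes if indegree[n] == 0)
    order = []
    while queue:
        n = queue.popleft()
        order.append(n)
        for m in adjacent[n]:
            indegree[m] -= 1
            if indegree[m] == 0:
                queue.append(m)
    if len(order) != len(nodes):
        raise ValueError("graph has a cycle")
    return order

# Hypothetical four-layer net: data -> conv -> pool -> fc.
nodes = ["data", "conv", "pool", "fc"]
edges = [("data", "conv"), ("conv", "pool"), ("pool", "fc")]
order = topological_sort(nodes, edges)
```

Creating one Layer per node while walking `order` guarantees every layer's source layers already exist when it is connected, which is exactly why the graph is sorted before layer creation.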