singa-dev mailing list archives

From "wangwei (JIRA)" <j...@apache.org>
Subject [jira] [Updated] (SINGA-29) Update NeuralNet class to enable customizing layer partition type
Date Sat, 11 Jul 2015 06:30:05 GMT

     [ https://issues.apache.org/jira/browse/SINGA-29?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

wangwei updated SINGA-29:
-------------------------
    Description: 
This ticket is to update the NeuralNet class to enable users to customize the partitioning of
each layer. It also cleans up the code of the NeuralNet and Graph classes.

There are two places where the user can configure the partitioning of the neural net:
* partition_dim for the whole neural net (in NetProto)
* partition_dim for each layer (in LayerProto)
The net's partition_dim is copied to each layer whose partition_dim is not set; otherwise,
the layer's own partition_dim takes precedence.
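
To make this copy-down rule concrete, here is a minimal C++ sketch. The structs below are
illustrative stand-ins, not SINGA's generated NetProto/LayerProto messages; the field names
and the has_partition_dim flag are assumptions based on the description above.

#include <vector>

// Hypothetical stand-ins for the NetProto/LayerProto messages; in the real
// code these are generated by protobuf. Field names are assumptions.
struct LayerProto {
  bool has_partition_dim = false;  // whether the user set it explicitly
  int partition_dim = -1;          // -1: none, 0: batch dim, 1: feature dim
};

struct NetProto {
  int partition_dim = -1;          // net-wide default
  std::vector<LayerProto> layers;
};

// Copy the net-level partition_dim to every layer that did not set its own.
void ApplyNetPartitionDim(NetProto* net) {
  for (LayerProto& layer : net->layers) {
    if (!layer.has_partition_dim) {
      layer.partition_dim = net->partition_dim;
      layer.has_partition_dim = true;
    }
    // Otherwise the layer's own partition_dim takes precedence.
  }
}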

Currently we support three values of partition_dim:
* partition_dim = -1, no partition
* partition_dim = 0, partition along the batch dimension, e.g., partition one mini-batch of
100 images into two partitions, each with 50 images.
* partition_dim = 1, partition along the feature dimension, e.g., partitioning one mini-batch
of 100 images, each represented by a 128-d feature vector, into two partitions gives each
partition all 100 images, each represented by a 64-d feature vector.
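
The shape arithmetic of these cases can be illustrated with a small, self-contained C++
sketch (a hypothetical helper, not part of SINGA), assuming a mini-batch of shape
(batch, feature) is split evenly among the partitions:

#include <cassert>
#include <cstdio>
#include <utility>

// Return the (batch, feature) shape of each partition when a mini-batch of
// shape (batch, feature) is split into num_partitions pieces along
// partition_dim (-1: no split, 0: batch dimension, 1: feature dimension).
std::pair<int, int> PartitionShape(int batch, int feature,
                                   int partition_dim, int num_partitions) {
  switch (partition_dim) {
    case 0:
      assert(batch % num_partitions == 0);
      return {batch / num_partitions, feature};
    case 1:
      assert(feature % num_partitions == 0);
      return {batch, feature / num_partitions};
    default:  // -1: no partitioning
      return {batch, feature};
  }
}

int main() {
  // 100 images x 128-d features split into two partitions.
  auto by_batch = PartitionShape(100, 128, 0, 2);    // -> 50 x 128
  auto by_feature = PartitionShape(100, 128, 1, 2);  // -> 100 x 64
  std::printf("dim 0: %d x %d, dim 1: %d x %d\n",
              by_batch.first, by_batch.second,
              by_feature.first, by_feature.second);
  return 0;
}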

Simple partitioning schemes will be tested first, e.g., partitioning the whole net with
partition_dim = 0. Complex partitioning schemes, e.g., hybrid partitioning (some layers
partitioned along dimension 0 and others along dimension 1, or not partitioned at all with -1),
will be tested later. Advanced partitioning, such as assigning some layers to one worker and
other layers to another worker, will be supported in the future by customizing the partition_id
of LayerProto.

NeuralNet is constructed as follows:
The neural net configuration is converted into a graph with one node per (sub-)layer. Connection
nodes are inserted automatically if the neural net needs partitioning (e.g., group size > 1).
After a topological sort, one Layer is created per node and the layers are connected accordingly.
The Graph class provides functions for adding/removing nodes and edges and for sorting nodes in
topological order. Each node stores the configuration of one layer.
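
For illustration, here is a minimal C++ sketch of the kind of Graph interface described above,
using Kahn's algorithm for the topological sort. This is not the actual SINGA Graph class, just
a sketch under the assumption that each node can be identified by the name of the layer
configuration it stores.

#include <map>
#include <queue>
#include <string>
#include <vector>

// Illustrative graph: each node is keyed by a layer-configuration name;
// edges record the connections between layers.
class Graph {
 public:
  void AddNode(const std::string& name) { adj_[name]; }
  void AddEdge(const std::string& src, const std::string& dst) {
    adj_[src].push_back(dst);
    adj_[dst];  // ensure dst exists even if it has no outgoing edge
  }
  // Kahn's algorithm: repeatedly emit nodes whose inputs have all been emitted.
  std::vector<std::string> TopologicalSort() const {
    std::map<std::string, int> indegree;
    for (const auto& kv : adj_) indegree[kv.first];
    for (const auto& kv : adj_)
      for (const auto& dst : kv.second) ++indegree[dst];
    std::queue<std::string> ready;
    for (const auto& kv : indegree)
      if (kv.second == 0) ready.push(kv.first);
    std::vector<std::string> order;
    while (!ready.empty()) {
      std::string node = ready.front();
      ready.pop();
      order.push_back(node);
      for (const auto& dst : adj_.at(node))
        if (--indegree[dst] == 0) ready.push(dst);
    }
    return order;  // one Layer would be created per node, in this order
  }

 private:
  std::map<std::string, std::vector<std::string>> adj_;
};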


> Update NeuralNet class to enable customizing layer partition type
> -----------------------------------------------------------------
>
>                 Key: SINGA-29
>                 URL: https://issues.apache.org/jira/browse/SINGA-29
>             Project: Singa
>          Issue Type: Bug
>            Reporter: wangwei
>            Assignee: wangwei
>



