singa-dev mailing list archives

From "ASF subversion and git services (JIRA)" <>
Subject [jira] [Commented] (SINGA-180) Add Activation layer and Softmax layer
Date Fri, 03 Jun 2016 07:48:59 GMT


ASF subversion and git services commented on SINGA-180:

Commit 3e2507b7af8c4fe3746f3156f29eba99a30e546f in incubator-singa's branch refs/heads/dev
from jixin
[;h=3e2507b ]

SINGA-180 Add Activation layer and Softmax layer

Add CPU and cuDNN implementations for the activation and softmax layers.

Note: the activation layer currently supports the sigmoid/tanh functions and relu forward computation.

Remove the tensor softmax function. Instead, use the tensor op (*) and function (Sum) to implement
the softmax function.
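The decomposition described above can be sketched in plain Python (a hypothetical illustration only; the element-wise exp, the Sum reduction, and the `*` op stand in for the corresponding tensor functions, and `softmax` here is not SINGA's actual API):

```python
import math

def softmax(xs):
    # Compose softmax from an element-wise exp, a Sum reduction,
    # and an element-wise multiply (*), instead of a dedicated kernel.
    exps = [math.exp(v) for v in xs]   # element-wise exp
    total = sum(exps)                  # Sum reduction over all elements
    scale = 1.0 / total
    return [e * scale for e in exps]   # element-wise op (*)

probs = softmax([1.0, 2.0, 3.0])
print(probs)  # entries sum to 1, larger inputs get larger probabilities
```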

Add test files for the activation and softmax layers.

Add element-wise implementations for the activation functions (relu/tanh/sigmoid).
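An element-wise activation forward pass can be sketched as follows (a minimal stand-alone illustration; `forward` and the function names are hypothetical and not SINGA's actual API):

```python
import math

def relu(x):
    # relu: max(0, x), applied per element
    return x if x > 0.0 else 0.0

def sigmoid(x):
    # sigmoid: 1 / (1 + e^-x), applied per element
    return 1.0 / (1.0 + math.exp(-x))

def forward(name, xs):
    # Apply the chosen activation to every element of the input.
    fn = {"relu": relu, "sigmoid": sigmoid, "tanh": math.tanh}[name]
    return [fn(v) for v in xs]

print(forward("relu", [-1.0, 0.5, 2.0]))
```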

Add tensor scalar comparison functions (<, <=, >, >=), i.e., to compare a tensor
with a constant.
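The tensor-vs-constant comparison can be pictured as producing a 0/1 mask the same shape as the input (a hypothetical sketch using a flat list; `compare` is not SINGA's actual API):

```python
import operator

def compare(xs, op, c):
    # Compare each element of xs against the constant c with one of
    # operator.lt / le / gt / ge, yielding a 0/1 mask of the same shape.
    return [1.0 if op(v, c) else 0.0 for v in xs]

print(compare([1.0, 2.0, 3.0], operator.lt, 2.0))
print(compare([1.0, 2.0, 3.0], operator.ge, 2.0))
```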

Add implementations of the tensor math functions (exp, log, pow).

Add functions for matrix op vector, where op is multiply or div.
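A matrix-op-vector operation of this kind applies the vector element-wise to each row of the matrix (a hypothetical sketch with nested lists; `mult_row`/`div_row` are illustrative names, not SINGA's actual API):

```python
def mult_row(mat, vec):
    # Multiply each row of mat element-wise by vec.
    return [[m * v for m, v in zip(row, vec)] for row in mat]

def div_row(mat, vec):
    # Divide each row of mat element-wise by vec.
    return [[m / v for m, v in zip(row, vec)] for row in mat]

print(mult_row([[1.0, 2.0], [3.0, 4.0]], [10.0, 100.0]))
```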

Pass all tests.

> Add Activation layer and Softmax layer
> --------------------------------------
>                 Key: SINGA-180
>                 URL:
>             Project: Singa
>          Issue Type: New Feature
>            Reporter: Xin Ji
> The Activation and Softmax layers are implemented using Tensor math functions.
> CudnnActivation is implemented using both cudnn 4 and cudnn 5.
> The CudnnSoftmax layer is implemented using cudnn 5, whose APIs are the same as cudnn 4's.
> Test files are added for testing the correctness of the above four layers.

This message was sent by Atlassian JIRA
