singa-dev mailing list archives

From "wangwei (JIRA)" <>
Subject [jira] [Created] (SINGA-100) Implement layers using CUDNN for GPU training
Date Mon, 02 Nov 2015 13:24:27 GMT
wangwei created SINGA-100:

             Summary: Implement layers using CUDNN for GPU training
                 Key: SINGA-100
             Project: Singa
          Issue Type: New Feature
            Reporter: wangwei

NVIDIA has released the cuDNN library, which provides optimized implementations of CNN operations such as convolution and pooling, and achieves good overall performance. Hence, it is essential to add cuDNN-backed
layers in SINGA for efficient GPU training (SINGA-41).

We will use the cuDNN library to implement the following layers:
cudnnConvolutionLayer, cudnnPoolingLayer, cudnnLRNLayer, cudnnSoftmaxLayer, cudnnReLULayer,
cudnnSigmoidLayer, cudnnTanhLayer, and cudnnDivNormLayer.

The float-16 data type will not be considered in this ticket.

This message was sent by Atlassian JIRA
