From: buildbot@apache.org
To: commits@singa.incubator.apache.org
Subject: svn commit: r961040 - in /websites/staging/singa/trunk/content: ./ community/ develop/ docs/
Date: Fri, 07 Aug 2015 09:11:25 -0000
Message-Id: <20150807091126.153ACAC0335@hades.apache.org>

Author: buildbot
Date: Fri Aug 7 09:11:25 2015
New Revision: 961040

Log:
Staging update by buildbot for singa

Modified:
    websites/staging/singa/trunk/content/   (props changed)
    websites/staging/singa/trunk/content/community.html
    websites/staging/singa/trunk/content/community/issue-tracking.html
    websites/staging/singa/trunk/content/community/mail-lists.html
    websites/staging/singa/trunk/content/community/source-repository.html
    websites/staging/singa/trunk/content/community/team-list.html
    websites/staging/singa/trunk/content/develop/contribute-code.html
    websites/staging/singa/trunk/content/develop/contribute-docs.html
    websites/staging/singa/trunk/content/develop/how-contribute.html
    websites/staging/singa/trunk/content/develop/schedule.html
    websites/staging/singa/trunk/content/docs.html
    websites/staging/singa/trunk/content/docs/architecture.html
    websites/staging/singa/trunk/content/docs/checkpoint.html
    websites/staging/singa/trunk/content/docs/cnn.html
    websites/staging/singa/trunk/content/docs/code-structure.html
    websites/staging/singa/trunk/content/docs/communication.html
    websites/staging/singa/trunk/content/docs/data.html
    websites/staging/singa/trunk/content/docs/datashard.html
    websites/staging/singa/trunk/content/docs/debug.html
    websites/staging/singa/trunk/content/docs/examples.html
    websites/staging/singa/trunk/content/docs/hdfs.html
    websites/staging/singa/trunk/content/docs/installation.html
    websites/staging/singa/trunk/content/docs/layer.html
    websites/staging/singa/trunk/content/docs/lmdb.html
    websites/staging/singa/trunk/content/docs/mlp.html
    websites/staging/singa/trunk/content/docs/model-config.html
    websites/staging/singa/trunk/content/docs/neuralnet-partition.html
    websites/staging/singa/trunk/content/docs/neuralnet.html
    websites/staging/singa/trunk/content/docs/programmer-guide.html
    websites/staging/singa/trunk/content/docs/rbm.html
    websites/staging/singa/trunk/content/docs/rnn.html
    websites/staging/singa/trunk/content/index.html
    websites/staging/singa/trunk/content/introduction.html
    websites/staging/singa/trunk/content/quick-start.html

Propchange: websites/staging/singa/trunk/content/
------------------------------------------------------------------------------
--- cms:source-revision (original)
+++ cms:source-revision Fri Aug 7 09:11:25 2015
@@ -1 +1 @@
-1693520
+1694636

Modified: websites/staging/singa/trunk/content/docs/layer.html
==============================================================================
--- websites/staging/singa/trunk/content/docs/layer.html (original)
+++ websites/staging/singa/trunk/content/docs/layer.html Fri Aug 7 09:11:25 2015

    Layers Instruction


    ShardData Layer

    The ShardData layer reads data records from a DataShard stored on disk and feeds them to the network in mini-batches.

    layer
    {
        name:"data"
        type:"kShardData"
        data_param
        {
            path:"Shard_File_Path"
            batchsize:int
            //number of records to read per batch
        }
        exclude:kTrain|kValidation|kTest|kPositive|kNegative
        //phase(s) in which this layer is excluded from the neural net
    }

    Label Layer

    The Label layer extracts the label information from the input data records; the labels are later used by the loss layer to compute the gradient.

    layer
    {
        name:"label"
        type:"kLabel"
        srclayers:"data"
    }

    Convolution Layer

    The Convolution layer is a basic layer of convolutional neural networks. It extracts local features by applying a set of filters over sliding windows of the image.

    layer
    {
        name:"Conv_Number"
        type:"kConvolution"
        srclayers:"Src_Layer_Name"
        convolution_param
        {
            num_filters:int
            //the number of filters to apply
            kernel:int
            //convolution kernel size
            stride:int
            //the step between successive applications of the filter
            pad:int
            //pad the image with a border of the given number of zero pixels
        }
        param
        {
            name:"weight"
            init_method:kGaussian|kConstant|kUniform|kPretrained|kGaussianSqrtFanIn|kUniformSqrtFanIn|kUniformSqrtFanInOut
            /*set the fields specific to the chosen init method*/
            learning_rate_multiplier:float
        }
        param
        {
            name:"bias"
            init_method:kConstant|kGaussian|kUniform|kPretrained|kGaussianSqrtFanIn|kUniformSqrtFanIn|kUniformSqrtFanInOut
            /*set the fields specific to the chosen init method*/
            learning_rate_multiplier:float
        }
        //kConstant: set every element to value
        //kGaussian: sample from a Gaussian with the given mean and std
        //kUniform: uniform sampling between low and high
        //kPretrained: copy the values saved from a previous training run
        //kGaussianSqrtFanIn: sample from a Gaussian, then scale by a=1/sqrt(fan_in)
        //kUniformSqrtFanIn: from Toronto Convnet, for rectified linear activation;
        //   let a=sqrt(3)/sqrt(fan_in), range is [-a,+a];
        //   no need to set value=sqrt(3), the program multiplies it
        //kUniformSqrtFanInOut: from the Theano MLP tutorial, let a=1/sqrt(fan_in+fan_out);
        //   for tanh activation the range is [-6a,+6a], for sigmoid activation [-24a,+24a];
        //   put the scale factor in the value field
        //For Constant init, use value:float
        //For Gaussian init, use mean:float, std:float
        //For Uniform init, use low:float, high:float
    }
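    For example, a weight parameter initialized from a Gaussian distribution could be configured as follows; the numeric values are illustrative only.

    param
    {
        name:"weight"
        init_method:kGaussian
        mean:0.0
        std:0.01
        //sample each weight from a Gaussian with mean 0 and std 0.01
        learning_rate_multiplier:1.0
    }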

    Input: n * c_i * h_i * w_i

    Output: n * c_o * h_o * w_o, where h_o = (h_i + 2 * pad_h - kernel_h) / stride_h + 1
    (w_o is computed from w_i, pad_w, kernel_w and stride_w in the same way)

    Dropout Layer

    The Dropout layer randomly drops (zeroes out) a fraction of its inputs during training. This scheme helps keep the model from over-fitting.
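    A minimal configuration sketch, assuming the layer type is kDropout and that the drop probability is set through a dropout_param message with a dropout_ratio field (these names are assumptions, not taken from this page):

    layer
    {
        name:"drop1"
        type:"kDropout"
        srclayers:"Src_Layer_Name"
        dropout_param
        {
            dropout_ratio:0.5
            //fraction of inputs to drop; field name assumed
        }
    }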

    InnerProduct Layer

    The InnerProduct layer is a fully connected layer, the basic building block of feed-forward neural networks. It takes the output of the lower layer as an input vector V and produces an output vector H via the matrix-vector multiplication:

    H = W * V + B   // W and B are the layer's weight and bias parameters

    layer
    {
        name:"IP_Number"
        type:"kInnerProduct"
        srclayers:"Src_Layer_Name"
        inner_product_param
        {
            num_output:int
            //dimension of the output feature vector (number of neurons in this layer)
        }
        param
        {
            name:"weight"
            init_method:kGaussian|kConstant|kUniform|kPretrained|kGaussianSqrtFanIn|kUniformSqrtFanIn|kUniformSqrtFanInOut
            std:float
            //std (and mean) are used by the Gaussian init methods
            learning_rate_multiplier:float
            //multiplier applied to the global learning rate for this parameter
            weight_decay_multiplier:int
            //multiplier applied to the global weight decay for this parameter
            /*low:float, high:float are used by the Uniform init methods*/
        }
        param
        {
            name:"bias"
            init_method:kConstant|kGaussian|kUniform|kPretrained|kGaussianSqrtFanIn|kUniformSqrtFanIn|kUniformSqrtFanInOut
            learning_rate_multiplier:float
            weight_decay_multiplier:int
            value:int
            //value is used by the Constant init method
            /*low:float, high:float are used by the Uniform init methods*/
        }
    }

    Input: n * c_i * h_i * w_i
    Output: n * c_o * 1 * 1

    LMDBData Layer

    This is a data input layer; the records are read from an LMDB database.

    layer
    {
        name:"data"
        type:"kLMDBData"
        data_param
        {
            path:"LMDB_FILE_PATH"
            batchsize:int
            //number of records to read per batch
        }
        exclude:kTrain|kValidation|kTest|kPositive|kNegative
    }

    LRN Layer

    Local Response Normalization (LRN) normalizes over local input regions, performing a kind of "lateral inhibition". It provides two modes: WITHIN_CHANNEL and ACROSS_CHANNELS. In ACROSS_CHANNELS mode, the local regions extend across nearby channels but have no spatial extent (i.e., they have shape local_size x 1 x 1). In WITHIN_CHANNEL mode, the local regions extend spatially but stay within a single channel (i.e., they have shape 1 x local_size x local_size). Each input value is divided by (1 + (alpha / n) * sum_i x_i^2)^beta, where n is the size of each local region and the sum is taken over the region centered at that value (zero padding is added where necessary).

    layer
    {
        name:"Norm_Number"
        type:"kLRN"
        lrn_param
        {
            norm_region:WITHIN_CHANNEL|ACROSS_CHANNELS
            local_size:int
            //for WITHIN_CHANNEL, the side length of the square spatial region to sum over
            //for ACROSS_CHANNELS, the number of adjacent channels to sum over
            alpha:5e-05
            //scaling parameter in the normalization formula
            beta:float
            //exponent in the normalization formula
        }
        srclayers:"Src_Layer_Name"
    }

    MnistImage Layer

    MnistImage is a pre-processing (parser) layer for the MNIST dataset; its parameters control elastic distortion and normalization of the input images.

    layer
    {
        name:"mnist"
        type:"kMnistImage"
        srclayers:"data"
        mnist_param
        {
            sigma:int
            alpha:int
            gamma:int
            kernel:int
            elastic_freq:int
            beta:int
            resize:int
            norm_a:int
        }
    }

    Pooling Layer

    Max pooling slides a window over the input and keeps the maximum value within each window; average pooling computes the mean of all values in the window.

    layer
    {
        name:"Pool_Number"
        type:"kPooling"
        srclayers:"Src_Layer_Name"
        pooling_param
        {
            pool:AVE|MAX
            //choose average pooling or max pooling
            kernel:int
            //size of the pooling window
            stride:int
            //step length of the window
        }
    }

    ReLU Layer

    The rectifier is the activation function f(x) = max(0, x); a node using it is called a ReLU node. It is popular mainly because it is much cheaper to compute than conventional activation functions such as the sigmoid and hyperbolic tangent, without a significant loss of generalization accuracy. Like other activation functions, it is used instead of a linear activation to add non-linearity to the network; otherwise the network could only ever compute a linear function of its input.

    layer
    {
        name:"Relu_Number"
        type:"kReLU"
        srclayers:"Src_Layer_Name"
    }

    RGBImage Layer

    The RGBImage layer is a pre-processing (parser) layer for RGB-format images; it can subtract a mean image, specified by meanfile, from each input image.

    layer
    {
        name:"rgb"
        type:"kRGBImage"
        srclayers:"data"
        rgbimage_param
        {
            meanfile:"Image_Mean_File_Path"
        }
    }

    Tanh Layer

    The Tanh layer uses tanh as its activation function, transforming each input value into the range [-1, 1].

    layer
    {
        name:"Tanh_Number"
        type:"kTanh"
        srclayers:"Src_Layer_Name"
    }

    SoftmaxLoss Layer

    The SoftmaxLoss layer implements the multi-class softmax loss function. It is generally used as the final layer of a classification network, producing predicted labels and the loss.

    layer
    {
        name:"loss"
        type:"kSoftmaxLoss"
        softmaxloss_param
        {
            topk:int
            //k used when computing top-k accuracy
        }
        srclayers:"Src_Layer_Name"
        //source layer providing the predictions
        srclayers:"Src_Layer_Name"
        //source layer providing the ground-truth labels (e.g., the label layer)
    }

    BridgeSrc & BridgeDst Layer

    BridgeSrc and BridgeDst are utility layers that implement the logic of model partitioning. They act as synchronization points and as intermediate storage when data is transferred between the partitions of a model.
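    A minimal sketch of how a bridge pair could be connected is shown below; the layer names are illustrative, and in practice SINGA usually inserts bridge layers automatically when a network is partitioned.

    layer
    {
        name:"bridge_src"
        type:"kBridgeSrc"
        srclayers:"Src_Layer_Name"
    }
    layer
    {
        name:"bridge_dst"
        type:"kBridgeDst"
        srclayers:"bridge_src"
    }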

    Concate Layer

    The Concate layer concatenates the outputs of two (or more) source layers along the last dimension (namely, num_feature). It is usually used together with fully connected layers.
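    A minimal configuration sketch, assuming a concate_param message with a concate_dim field (both names are assumptions, not taken from this page):

    layer
    {
        name:"concate"
        type:"kConcate"
        srclayers:"Src_Layer_Name_1"
        srclayers:"Src_Layer_Name_2"
        concate_param
        {
            concate_dim:int
            //dimension along which to concatenate; field name assumed
        }
    }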

    Parser Layer

    A Parser layer parses the input records into Blobs (feature and label tensors). The MnistImage and RGBImage layers above are examples of parser layers.

    Prefetch Layer

    The Prefetch layer pre-fetches data from disk in the background, so that the I/O task and the computation/communication tasks can run simultaneously.

    layer
    {
        name:"prefetch"
        type:"kPrefetch"
        sublayers
        {
            name:"data"
            type:"kShardData"
            data_param
            {
                path:"Shard_File_Path"
                batchsize:int
            }
        }
        sublayers
        {
            name:"rgb"
            type:"kRGBImage"
            srclayers:"data"
            rgbimage_param
            {
                meanfile:"Image_Mean_File_Path"
            }
        }
        sublayers
        {
            name:"label"
            type:"kLabel"
            srclayers:"data"
        }
        exclude:kTrain|kValidation|kTest|kPositive|kNegative
    }

    Slice Layer

    The Slice layer is a utility layer that slices an input layer into multiple output layers along a given dimension (currently num or channel only) with given slice indices.
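    A minimal configuration sketch, assuming a slice_param message with a slice_dim field (both names are assumptions, not taken from this page):

    layer
    {
        name:"slice"
        type:"kSlice"
        srclayers:"Src_Layer_Name"
        slice_param
        {
            slice_dim:int
            //dimension along which to slice (num or channel); field name assumed
        }
    }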

    Split Layer

    The Split layer separates an input blob into several output blobs. It is used when one input blob needs to be fed into several different layers.
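    A minimal configuration sketch, assuming a split_param message with a num_splits field (both names are assumptions, not taken from this page):

    layer
    {
        name:"split"
        type:"kSplit"
        srclayers:"Src_Layer_Name"
        split_param
        {
            num_splits:int
            //number of output blobs to produce; field name assumed
        }
    }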
