singa-dev mailing list archives

From "ASF subversion and git services (JIRA)" <>
Subject [jira] [Commented] (SINGA-275) Add Cross Entropy Loss for multiple labels
Date Tue, 29 Nov 2016 05:31:58 GMT


ASF subversion and git services commented on SINGA-275:

Commit d1110c0b7101fff6999db1dd5cccb14bf8370578 in incubator-singa's branch refs/heads/master
from [~wangwei.cs]
[;h=d1110c0 ]

SINGA-275 - Add Cross Entropy Loss for multiple labels

Updated the softmax cross entropy loss layer and the tensor functions to allow
the ground truth to be a binary array for each instance;

Added unit tests for cross entropy with multiple labels per instance;

For a batch of input instances, the ground truth tensor can be either an
integer array with one value per instance, or a binary matrix with one row
per instance.

For a single input instance, the feature tensor is a 1-d array, and the
ground truth tensor is a 1-d array (either a single integer value or a
binary array).
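The behavior described above can be sketched in plain NumPy (this is a minimal illustration of the loss semantics, not SINGA's actual C++/Python implementation; the function name `softmax_cross_entropy` is hypothetical):

```python
import numpy as np

def softmax(x):
    # Numerically stable softmax over the last axis.
    e = np.exp(x - x.max(axis=-1, keepdims=True))
    return e / e.sum(axis=-1, keepdims=True)

def softmax_cross_entropy(logits, truth):
    """Mean cross entropy loss over a batch.

    `truth` is either an integer array (one class index per instance)
    or a binary matrix (one row per instance, possibly several 1s).
    """
    p = softmax(np.asarray(logits, dtype=float))   # (batch, classes)
    truth = np.asarray(truth)
    if truth.ndim == 1:
        # Integer labels: expand to one-hot rows.
        target = np.zeros_like(p)
        target[np.arange(p.shape[0]), truth] = 1.0
    else:
        # Binary matrix: normalize each row to a distribution.
        target = truth / truth.sum(axis=1, keepdims=True)
    return float(-(target * np.log(p)).sum(axis=1).mean())
```

With uniform logits `[[0, 0]]`, both `truth=[0]` and `truth=[[1, 1]]` give a loss of log 2, showing that the two ground-truth formats share one code path once the target is a per-row distribution.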

> Add Cross Entropy Loss for multiple labels
> ------------------------------------------
>                 Key: SINGA-275
>                 URL:
>             Project: Singa
>          Issue Type: New Feature
>            Reporter: RUAN PINGCHENG
>            Priority: Critical
> A cross entropy loss function that can handle multiple labels. Normalize the label vector
during training and testing, e.g., <1, 0, 0, 1> -> <0.5, 0, 0, 0.5>.
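The normalization requested in the issue divides a binary label vector by its sum so it forms a probability distribution. A one-line NumPy sketch (the helper name `normalize_labels` is an assumption for illustration):

```python
import numpy as np

def normalize_labels(t):
    # Scale a binary label vector so its entries sum to 1,
    # e.g. [1, 0, 0, 1] -> [0.5, 0, 0, 0.5].
    t = np.asarray(t, dtype=float)
    return t / t.sum()
```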

This message was sent by Atlassian JIRA
