labs-commits mailing list archives

From tomm...@apache.org
Subject svn commit: r1707664 - in /labs/yay/trunk/core/src/test/java/org/apache/yay/core: Word2VecTest.java WordVectorsTest.java
Date Fri, 09 Oct 2015 06:51:24 GMT
Author: tommaso
Date: Fri Oct  9 06:51:24 2015
New Revision: 1707664

URL: http://svn.apache.org/viewvc?rev=1707664&view=rev
Log:
more appropriate test name, using sigmoid on output layer until softmax is there

Added:
    labs/yay/trunk/core/src/test/java/org/apache/yay/core/WordVectorsTest.java
      - copied, changed from r1707564, labs/yay/trunk/core/src/test/java/org/apache/yay/core/Word2VecTest.java
Removed:
    labs/yay/trunk/core/src/test/java/org/apache/yay/core/Word2VecTest.java

Copied: labs/yay/trunk/core/src/test/java/org/apache/yay/core/WordVectorsTest.java (from r1707564, labs/yay/trunk/core/src/test/java/org/apache/yay/core/Word2VecTest.java)
URL: http://svn.apache.org/viewvc/labs/yay/trunk/core/src/test/java/org/apache/yay/core/WordVectorsTest.java?p2=labs/yay/trunk/core/src/test/java/org/apache/yay/core/WordVectorsTest.java&p1=labs/yay/trunk/core/src/test/java/org/apache/yay/core/Word2VecTest.java&r1=1707564&r2=1707664&rev=1707664&view=diff
==============================================================================
--- labs/yay/trunk/core/src/test/java/org/apache/yay/core/Word2VecTest.java (original)
+++ labs/yay/trunk/core/src/test/java/org/apache/yay/core/WordVectorsTest.java Fri Oct  9 06:51:24 2015
@@ -53,7 +53,7 @@ import static org.junit.Assert.*;
 /**
  * Integration test for using Yay to implement word vectors algorithms.
  */
-public class Word2VecTest {
+public class WordVectorsTest {
 
   @Test
   public void testSGM() throws Exception {
@@ -76,7 +76,7 @@ public class Word2VecTest {
     Map<Integer, ActivationFunction<Double>> activationFunctions = new HashMap<Integer, ActivationFunction<Double>>();
     activationFunctions.put(0, new IdentityActivationFunction<Double>());
     // TODO : place a softmax activation for the output layer
-    activationFunctions.put(0, new IdentityActivationFunction<Double>());
+    activationFunctions.put(1, new SigmoidFunction());
     FeedForwardStrategy predictionStrategy = new FeedForwardStrategy(activationFunctions);
     BackPropagationLearningStrategy learningStrategy = new BackPropagationLearningStrategy(0.05d, 10,
             BackPropagationLearningStrategy.DEFAULT_THRESHOLD, predictionStrategy, new LMSCostFunction(),
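The change above swaps the (mistakenly duplicated) identity activation at key 0 for a `SigmoidFunction` at key 1, i.e. on the output layer, as a stand-in until a softmax activation is available. As a rough illustration of the difference (plain Java, not Yay's API; class and method names here are hypothetical): sigmoid squashes each output independently into (0, 1), while softmax normalizes the whole output vector into a probability distribution.

```java
import java.util.Arrays;

// Hypothetical sketch, not part of the Yay codebase: contrasts the
// element-wise sigmoid used in the commit with the softmax the TODO refers to.
public class ActivationSketch {

  // sigmoid: maps a single value into (0, 1), applied per-output
  static double sigmoid(double x) {
    return 1d / (1d + Math.exp(-x));
  }

  // softmax: maps the whole output vector onto a probability distribution
  static double[] softmax(double[] xs) {
    double max = Arrays.stream(xs).max().orElse(0d); // shift for numerical stability
    double[] out = new double[xs.length];
    double sum = 0d;
    for (int i = 0; i < xs.length; i++) {
      out[i] = Math.exp(xs[i] - max);
      sum += out[i];
    }
    for (int i = 0; i < xs.length; i++) {
      out[i] /= sum;
    }
    return out;
  }

  public static void main(String[] args) {
    System.out.println("sigmoid(0) = " + sigmoid(0d)); // prints 0.5
    double[] probs = softmax(new double[]{1d, 2d, 3d});
    // entries are positive and sum to ~1.0, unlike independent sigmoids
    System.out.println("softmax(1,2,3) = " + Arrays.toString(probs));
  }
}
```

With sigmoid on the output layer the per-word scores are not a normalized distribution, which is presumably why the TODO keeps softmax as the intended activation for the word-vector objective.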



---------------------------------------------------------------------
To unsubscribe, e-mail: commits-unsubscribe@labs.apache.org
For additional commands, e-mail: commits-help@labs.apache.org

